Mar 19 09:17:00.291078 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 09:17:00.906903 master-0 kubenswrapper[4010]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:00.906903 master-0 kubenswrapper[4010]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 09:17:00.906903 master-0 kubenswrapper[4010]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:00.906903 master-0 kubenswrapper[4010]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 09:17:00.906903 master-0 kubenswrapper[4010]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 09:17:00.906903 master-0 kubenswrapper[4010]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
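The deprecation warnings above all point at the same remedy: move these flags into the file passed via --config. A minimal sketch of what that KubeletConfiguration could look like, assuming the kubelet.config.k8s.io/v1beta1 schema and reusing the values visible in this log (crio socket, volume plugin dir, master taint, system-reserved); exact field availability depends on the kubelet version:

```yaml
# Hypothetical sketch, not taken from this node's actual /etc/kubernetes/kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# config-file equivalent of --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# config-file equivalent of --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# config-file equivalent of --register-with-taints
registerWithTaints:
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
# config-file equivalent of --system-reserved
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
```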
Mar 19 09:17:00.908341 master-0 kubenswrapper[4010]: I0319 09:17:00.908112 4010 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915579 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915622 4010 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915633 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915643 4010 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915653 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915665 4010 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915677 4010 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915686 4010 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:17:00.915671 master-0 kubenswrapper[4010]: W0319 09:17:00.915696 4010 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915705 4010 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915715 4010 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915723 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915733 4010 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915756 4010 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915766 4010 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915778 4010 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915788 4010 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915800 4010 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915811 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915821 4010 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915830 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915839 4010 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915848 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915856 4010 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915865 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915873 4010 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915881 4010 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:17:00.916191 master-0 kubenswrapper[4010]: W0319 09:17:00.915890 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915898 4010 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915906 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915915 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915926 4010 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915952 4010 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915964 4010 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915975 4010 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915986 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.915995 4010 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916004 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916012 4010 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916021 4010 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916029 4010 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916038 4010 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916047 4010 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916056 4010 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916065 4010 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916074 4010 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916082 4010 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:17:00.917119 master-0 kubenswrapper[4010]: W0319 09:17:00.916091 4010 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916099 4010 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916108 4010 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916116 4010 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916125 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916133 4010 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916144 4010 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916155 4010 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916168 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916177 4010 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916188 4010 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916197 4010 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916206 4010 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916215 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916224 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916236 4010 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916247 4010 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916258 4010 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916269 4010 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916280 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:17:00.918215 master-0 kubenswrapper[4010]: W0319 09:17:00.916291 4010 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: W0319 09:17:00.916302 4010 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: W0319 09:17:00.916314 4010 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: W0319 09:17:00.916325 4010 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: W0319 09:17:00.916335 4010 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917509 4010 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917541 4010 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917562 4010 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917576 4010 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917587 4010 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917598 4010 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917610 4010 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917622 4010 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917634 4010 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917644 4010 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917655 4010 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917665 4010 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917675 4010 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917685 4010 flags.go:64] FLAG: --cgroup-root=""
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917694 4010 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917704 4010 flags.go:64] FLAG: --client-ca-file=""
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917713 4010 flags.go:64] FLAG: --cloud-config=""
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917722 4010 flags.go:64] FLAG: --cloud-provider=""
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917731 4010 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 09:17:00.919413 master-0 kubenswrapper[4010]: I0319 09:17:00.917744 4010 flags.go:64] FLAG: --cluster-domain=""
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917753 4010 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917764 4010 flags.go:64] FLAG: --config-dir=""
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917773 4010 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917784 4010 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917796 4010 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917805 4010 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917815 4010 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917825 4010 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917834 4010 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917846 4010 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917856 4010 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917866 4010 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917876 4010 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917889 4010 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917899 4010 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917909 4010 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917919 4010 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917929 4010 flags.go:64] FLAG: --enable-server="true"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917979 4010 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.917994 4010 flags.go:64] FLAG: --event-burst="100"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.918004 4010 flags.go:64] FLAG: --event-qps="50"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.918014 4010 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.918024 4010 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.918034 4010 flags.go:64] FLAG: --eviction-hard=""
Mar 19 09:17:00.920603 master-0 kubenswrapper[4010]: I0319 09:17:00.918047 4010 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918057 4010 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918067 4010 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918077 4010 flags.go:64] FLAG: --eviction-soft=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918086 4010 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918096 4010 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918106 4010 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918116 4010 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918125 4010 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918134 4010 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918144 4010 flags.go:64] FLAG: --feature-gates=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918155 4010 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918165 4010 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918175 4010 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918185 4010 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918195 4010 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918205 4010 flags.go:64] FLAG: --help="false"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918215 4010 flags.go:64] FLAG: --hostname-override=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918226 4010 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918236 4010 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918247 4010 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918257 4010 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918267 4010 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918276 4010 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918286 4010 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918295 4010 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 09:17:00.921823 master-0 kubenswrapper[4010]: I0319 09:17:00.918305 4010 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918314 4010 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918325 4010 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918334 4010 flags.go:64] FLAG: --kube-reserved=""
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918344 4010 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918353 4010 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918363 4010 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918373 4010 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918382 4010 flags.go:64] FLAG: --lock-file=""
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918392 4010 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918402 4010 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918413 4010 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918427 4010 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918436 4010 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918446 4010 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918455 4010 flags.go:64] FLAG: --logging-format="text"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918465 4010 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918509 4010 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918519 4010 flags.go:64] FLAG: --manifest-url=""
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918529 4010 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918541 4010 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918551 4010 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918563 4010 flags.go:64] FLAG: --max-pods="110"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918573 4010 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918583 4010 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 09:17:00.923249 master-0 kubenswrapper[4010]: I0319 09:17:00.918593 4010 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918605 4010 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918616 4010 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918627 4010 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918637 4010 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918659 4010 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918670 4010 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918680 4010 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918690 4010 flags.go:64] FLAG: --pod-cidr=""
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918699 4010 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918712 4010 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918722 4010 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918732 4010 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918741 4010 flags.go:64] FLAG: --port="10250"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918752 4010 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918761 4010 flags.go:64] FLAG: --provider-id=""
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918771 4010 flags.go:64] FLAG: --qos-reserved=""
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918781 4010 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918791 4010 flags.go:64] FLAG: --register-node="true"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918800 4010 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918810 4010 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918827 4010 flags.go:64] FLAG: --registry-burst="10"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918837 4010 flags.go:64] FLAG: --registry-qps="5"
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918846 4010 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 09:17:00.924453 master-0 kubenswrapper[4010]: I0319 09:17:00.918855 4010 flags.go:64] FLAG: --reserved-memory=""
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918867 4010 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918876 4010 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918887 4010 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918896 4010 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918906 4010 flags.go:64] FLAG: --runonce="false"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918915 4010 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918925 4010 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918935 4010 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918945 4010 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918955 4010 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918965 4010 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918974 4010 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918984 4010 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.918994 4010 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919003 4010 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919013 4010 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919022 4010 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919033 4010 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919043 4010 flags.go:64] FLAG: --system-cgroups=""
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919053 4010 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919067 4010 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919077 4010 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919087 4010 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919098 4010 flags.go:64] FLAG: --tls-min-version=""
Mar 19 09:17:00.925745 master-0 kubenswrapper[4010]: I0319 09:17:00.919109 4010 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919118 4010 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919127 4010 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919137 4010 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919146 4010 flags.go:64] FLAG: --v="2"
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919159 4010 flags.go:64] FLAG: --version="false"
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919170 4010 flags.go:64] FLAG: --vmodule=""
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919182 4010 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: I0319 09:17:00.919193 4010 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919410 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919422 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919433 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919446 4010 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919456 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919494 4010 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919505 4010 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919516 4010 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919524 4010 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919533 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919541 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919550 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:17:00.927013 master-0 kubenswrapper[4010]: W0319 09:17:00.919558 4010 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919569 4010 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919580 4010 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919590 4010 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919600 4010 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919609 4010 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919618 4010 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919626 4010 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919635 4010 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919643 4010 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919652 4010 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919661 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919669 4010 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919677 4010 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919685 4010 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919694 4010 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919703 4010 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919711 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919720 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:17:00.928016 master-0 kubenswrapper[4010]: W0319 09:17:00.919728 4010 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919737 4010 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919746 4010 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919754 4010 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919764 4010 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919772 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919783 4010 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919793 4010 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919802 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919810 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919819 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919827 4010 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919837 4010 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919845 4010 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919856 4010 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919866 4010 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919876 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919885 4010 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919893 4010 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:17:00.928934 master-0 kubenswrapper[4010]: W0319 09:17:00.919902 4010 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919910 4010 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919918 4010 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919927 4010 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919935 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919944 4010 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919952 4010 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919960 4010 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919969 4010 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919977 4010 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919985 4010 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.919993 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920002 4010 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920010 4010 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920019 4010 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920028 4010 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920036 4010 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920044 4010 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920053 4010 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920061 4010 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:17:00.929981 master-0 kubenswrapper[4010]: W0319 09:17:00.920072 4010 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:17:00.931050 master-0 kubenswrapper[4010]: W0319 09:17:00.920081 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:17:00.931050 master-0 kubenswrapper[4010]: I0319 09:17:00.921042 4010 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:17:00.932814 master-0 kubenswrapper[4010]: I0319 09:17:00.932741 4010 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 09:17:00.932814 master-0 kubenswrapper[4010]: I0319 09:17:00.932787 4010 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932881 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932892 4010 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932899 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932907 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932913 4010 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932918 4010 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932922 4010 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932927 4010 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932933 4010 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932938 4010 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932942 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932948 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932952 4010 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932957 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932962 4010 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932967 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932972 4010 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932978 4010 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932984 4010 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:17:00.932965 master-0 kubenswrapper[4010]: W0319 09:17:00.932990 4010 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.932996 4010 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933001 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933007 4010 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933013 4010 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933019 4010 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933024 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933029 4010 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933034 4010 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933039 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933054 4010 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933059 4010 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933065 4010 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933070 4010 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933075 4010 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933081 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933086 4010 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933091 4010 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933096 4010 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933101 4010 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:17:00.934038 master-0 kubenswrapper[4010]: W0319 09:17:00.933107 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933112 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933117 4010 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933124 4010 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933133 4010 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933138 4010 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933145 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933151 4010 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933156 4010 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933162 4010 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933169 4010 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933176 4010 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933186 4010 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933195 4010 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933206 4010 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933213 4010 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933220 4010 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933227 4010 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933234 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:17:00.935081 master-0 kubenswrapper[4010]: W0319 09:17:00.933241 4010 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933247 4010 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933253 4010 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933260 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933265 4010 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933272 4010 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933278 4010 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933285 4010 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933291 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933296 4010 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933301 4010 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933306 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933311 4010 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933316 4010 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: I0319 09:17:00.933325 4010 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:17:00.936436 master-0 kubenswrapper[4010]: W0319 09:17:00.933524 4010 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933537 4010 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933543 4010 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933548 4010 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933552 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933557 4010 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933562 4010 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933567 4010 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933573 4010 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933578 4010 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933583 4010 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933588 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933593 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933600 4010 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933605 4010 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933610 4010 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933615 4010 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933621 4010 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933626 4010 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933631 4010 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:17:00.937239 master-0 kubenswrapper[4010]: W0319 09:17:00.933636 4010 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933640 4010 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933645 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933651 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933655 4010 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933660 4010 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933665 4010 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933671 4010 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933678 4010 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933683 4010 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933688 4010 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933693 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933699 4010 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933705 4010 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933711 4010 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933717 4010 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933722 4010 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933728 4010 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933733 4010 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933739 4010 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:17:00.938371 master-0 kubenswrapper[4010]: W0319 09:17:00.933743 4010 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933748 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933753 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933760 4010 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933765 4010 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933771 4010 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933776 4010 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933782 4010 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933788 4010 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933794 4010 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933799 4010 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933804 4010 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933811 4010 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933817 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933823 4010 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933828 4010 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933834 4010 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933839 4010 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933844 4010 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:17:00.939521 master-0 kubenswrapper[4010]: W0319 09:17:00.933849 4010 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933855 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933860 4010 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933865 4010 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933870 4010 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933875 4010 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933880 4010 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933885 4010 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933890 4010 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933895 4010 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933900 4010 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933905 4010 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: W0319 09:17:00.933910 4010 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: I0319 09:17:00.933918 4010 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:17:00.940582 master-0 kubenswrapper[4010]: I0319 09:17:00.934158 4010 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 09:17:00.941327 master-0 kubenswrapper[4010]: I0319 09:17:00.940764 4010 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 19 09:17:00.942071 master-0 kubenswrapper[4010]: I0319 09:17:00.942020 4010 server.go:997] "Starting client certificate rotation"
Mar 19 09:17:00.942071 master-0 kubenswrapper[4010]: I0319 09:17:00.942055 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 09:17:00.942390 master-0 kubenswrapper[4010]: I0319 09:17:00.942322 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:00.971832 master-0 kubenswrapper[4010]: I0319 09:17:00.971735 4010 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:17:00.977856 master-0 kubenswrapper[4010]: E0319 09:17:00.977777 4010 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:00.979640 master-0 kubenswrapper[4010]: I0319 09:17:00.979583 4010 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:17:00.996437 master-0 kubenswrapper[4010]: I0319 09:17:00.996345 4010 log.go:25] "Validated CRI v1 runtime API"
Mar 19 09:17:01.001371 master-0 kubenswrapper[4010]: I0319 09:17:01.001316 4010 log.go:25] "Validated CRI v1 image API"
Mar 19 09:17:01.004146 master-0 kubenswrapper[4010]: I0319 09:17:01.004091 4010 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 09:17:01.011148 master-0 kubenswrapper[4010]: I0319 09:17:01.011065 4010 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 a870f5cc-57ed-47cd-b7c0-f85f1fc0e63d:/dev/vda3]
Mar 19 09:17:01.011148 master-0 kubenswrapper[4010]: I0319 09:17:01.011111 4010 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Mar 19 09:17:01.038333 master-0 kubenswrapper[4010]: I0319 09:17:01.037782 4010 manager.go:217] Machine: {Timestamp:2026-03-19 09:17:01.034738055 +0000 UTC m=+0.560682742 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:81766350eaa9426e82b63b9a7bdd6612 SystemUUID:81766350-eaa9-426e-82b6-3b9a7bdd6612 BootID:183da118-c1b7-4287-af5d-a72bb0b1fda1 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:d5:00:d5 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:d6:48:60:15:e4:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 09:17:01.038333 master-0 kubenswrapper[4010]: I0319 09:17:01.038267 4010 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 09:17:01.038606 master-0 kubenswrapper[4010]: I0319 09:17:01.038518 4010 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 09:17:01.039257 master-0 kubenswrapper[4010]: I0319 09:17:01.039210 4010 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 09:17:01.039713 master-0 kubenswrapper[4010]: I0319 09:17:01.039653 4010 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 09:17:01.040112 master-0 kubenswrapper[4010]: I0319 09:17:01.039708 4010 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 09:17:01.040180 master-0 kubenswrapper[4010]: I0319 09:17:01.040145 4010 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 09:17:01.040180 master-0 kubenswrapper[4010]: I0319 09:17:01.040169 4010 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 09:17:01.040370 master-0 kubenswrapper[4010]: I0319 09:17:01.040330 4010 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 09:17:01.040414 master-0 kubenswrapper[4010]: I0319 09:17:01.040392 4010 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 09:17:01.040790 master-0 kubenswrapper[4010]: I0319 09:17:01.040750 4010 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 09:17:01.040953 master-0 kubenswrapper[4010]: I0319 09:17:01.040916 4010 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 09:17:01.044762 master-0 kubenswrapper[4010]: I0319 09:17:01.044725 4010 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 09:17:01.044818 master-0 kubenswrapper[4010]: I0319 09:17:01.044770 4010 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 09:17:01.044856 master-0 kubenswrapper[4010]: I0319 09:17:01.044816 4010 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 09:17:01.044856 master-0 kubenswrapper[4010]: I0319 09:17:01.044838 4010 kubelet.go:324] "Adding apiserver pod source"
Mar 19 09:17:01.044914 master-0 kubenswrapper[4010]: I0319 09:17:01.044861 4010 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 09:17:01.049811 master-0 kubenswrapper[4010]: W0319 09:17:01.049704 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:01.049919 master-0 kubenswrapper[4010]: W0319 09:17:01.049805 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:01.050125 master-0 kubenswrapper[4010]: E0319 09:17:01.050077 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:01.050520 master-0 kubenswrapper[4010]: E0319 09:17:01.050329 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:01.050607 master-0 kubenswrapper[4010]: I0319 09:17:01.050430 4010 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 19 09:17:01.052625 master-0 kubenswrapper[4010]: I0319 09:17:01.052588 4010 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 09:17:01.052953 master-0 kubenswrapper[4010]: I0319 09:17:01.052924 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 09:17:01.052953 master-0 kubenswrapper[4010]: I0319 09:17:01.052952 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.052962 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.052971 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.052980 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.052989 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.052998 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.053006 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.053018 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.053028 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 09:17:01.053064 master-0 kubenswrapper[4010]: I0319 09:17:01.053042 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 09:17:01.053318 master-0 kubenswrapper[4010]: I0319 09:17:01.053126 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 09:17:01.054355 master-0 kubenswrapper[4010]: I0319 09:17:01.054295 4010 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 09:17:01.055024 master-0 kubenswrapper[4010]: I0319 09:17:01.054988 4010 server.go:1280] "Started kubelet"
Mar 19 09:17:01.056277 master-0 kubenswrapper[4010]: I0319 09:17:01.056112 4010 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 09:17:01.056277 master-0 kubenswrapper[4010]: I0319 09:17:01.056094 4010 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 09:17:01.056277 master-0 kubenswrapper[4010]: I0319 09:17:01.056242 4010 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 19 09:17:01.056812 master-0 kubenswrapper[4010]: I0319 09:17:01.056775 4010 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 09:17:01.056934 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 19 09:17:01.057156 master-0 kubenswrapper[4010]: I0319 09:17:01.057124 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:01.063925 master-0 kubenswrapper[4010]: I0319 09:17:01.063893 4010 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 09:17:01.064100 master-0 kubenswrapper[4010]: I0319 09:17:01.064082 4010 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 09:17:01.064527 master-0 kubenswrapper[4010]: I0319 09:17:01.064464 4010 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 09:17:01.064527 master-0 kubenswrapper[4010]: I0319 09:17:01.064519 4010 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 09:17:01.064863 master-0 kubenswrapper[4010]: E0319 09:17:01.064788 4010 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:17:01.064863 master-0 kubenswrapper[4010]: I0319 09:17:01.064864 4010 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 19 09:17:01.065343 master-0 kubenswrapper[4010]: I0319 09:17:01.065304 4010 server.go:449] "Adding debug handlers to kubelet server"
Mar 19 09:17:01.065409 master-0 kubenswrapper[4010]: I0319 09:17:01.065359 4010 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 09:17:01.065519 master-0 kubenswrapper[4010]: I0319 09:17:01.065396 4010 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 09:17:01.066696 master-0 kubenswrapper[4010]: E0319 09:17:01.066609 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 09:17:01.067716 master-0 kubenswrapper[4010]: W0319 09:17:01.067533 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:01.067895 master-0 kubenswrapper[4010]: E0319 09:17:01.067827 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:01.069749 master-0 kubenswrapper[4010]: E0319 09:17:01.066750 4010 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e336714d8a7f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.054945268 +0000 UTC m=+0.580889885,LastTimestamp:2026-03-19 09:17:01.054945268 +0000 UTC m=+0.580889885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:01.072961 master-0 kubenswrapper[4010]: I0319 09:17:01.072915 4010 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 09:17:01.073218 master-0 kubenswrapper[4010]: I0319 09:17:01.072969 4010 factory.go:55] Registering systemd factory
Mar 19 09:17:01.073218 master-0 kubenswrapper[4010]: I0319 09:17:01.072990 4010 factory.go:221] Registration of the systemd container factory successfully
Mar 19 09:17:01.073702 master-0 kubenswrapper[4010]: I0319 09:17:01.073666 4010 factory.go:153] Registering CRI-O factory
Mar 19 09:17:01.073702 master-0 kubenswrapper[4010]: I0319 09:17:01.073702 4010 factory.go:221] Registration of the crio container factory successfully
Mar 19 09:17:01.073840 master-0 kubenswrapper[4010]: I0319 09:17:01.073731 4010 factory.go:103] Registering Raw factory
Mar 19 09:17:01.073840 master-0 kubenswrapper[4010]: I0319 09:17:01.073754 4010 manager.go:1196] Started watching for new ooms in manager
Mar 19 09:17:01.074783 master-0 kubenswrapper[4010]: I0319 09:17:01.074705 4010 manager.go:319] Starting recovery of all containers
Mar 19 09:17:01.076885 master-0 kubenswrapper[4010]: E0319 09:17:01.076838 4010 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 19 09:17:01.095917 master-0 kubenswrapper[4010]: I0319 09:17:01.095876 4010 manager.go:324] Recovery completed
Mar 19 09:17:01.111053 master-0 kubenswrapper[4010]: I0319 09:17:01.110987 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.114247 master-0 kubenswrapper[4010]: I0319 09:17:01.114207 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.114337 master-0 kubenswrapper[4010]: I0319 09:17:01.114327 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.114391 master-0 kubenswrapper[4010]: I0319 09:17:01.114383 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.115731 master-0 kubenswrapper[4010]: I0319 09:17:01.115719 4010 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 09:17:01.115826 master-0 kubenswrapper[4010]: I0319 09:17:01.115815 4010 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 09:17:01.115902 master-0 kubenswrapper[4010]: I0319 09:17:01.115892 4010 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 09:17:01.119579 master-0 kubenswrapper[4010]: I0319 09:17:01.119566 4010 policy_none.go:49] "None policy: Start"
Mar 19 09:17:01.122317 master-0 kubenswrapper[4010]: I0319 09:17:01.122251 4010 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 09:17:01.122317 master-0 kubenswrapper[4010]: I0319 09:17:01.122309 4010 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 09:17:01.165357 master-0 kubenswrapper[4010]: E0319 09:17:01.165276 4010 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 09:17:01.194035 master-0 kubenswrapper[4010]: I0319 09:17:01.193918 4010 manager.go:334] "Starting Device Plugin manager"
Mar 19 09:17:01.194035 master-0 kubenswrapper[4010]: I0319 09:17:01.193960 4010 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 09:17:01.194035 master-0 kubenswrapper[4010]: I0319 09:17:01.193972 4010 server.go:79] "Starting device plugin registration server"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.194611 4010 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.194627 4010 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.195227 4010 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.195607 4010 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.196278 4010 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: E0319 09:17:01.196546 4010 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.224281 4010 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.225919 4010 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.225968 4010 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: I0319 09:17:01.225993 4010 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: E0319 09:17:01.226175 4010 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: W0319 09:17:01.226932 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:01.237020 master-0 kubenswrapper[4010]: E0319 09:17:01.226992 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:01.267714 master-0 kubenswrapper[4010]: E0319 09:17:01.267643 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 09:17:01.294911 master-0 kubenswrapper[4010]: I0319 09:17:01.294796 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.296186 master-0 kubenswrapper[4010]: I0319 09:17:01.296141 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.296186 master-0 kubenswrapper[4010]: I0319 09:17:01.296197 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.296433 master-0 kubenswrapper[4010]: I0319 09:17:01.296212 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.296433 master-0 kubenswrapper[4010]: I0319 09:17:01.296253 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:01.297433 master-0 kubenswrapper[4010]: E0319 09:17:01.297373 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:01.326627 master-0 kubenswrapper[4010]: I0319 09:17:01.326533 4010 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 19 09:17:01.326856 master-0 kubenswrapper[4010]: I0319 09:17:01.326686 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.328320 master-0 kubenswrapper[4010]: I0319 09:17:01.328199 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.328320 master-0 kubenswrapper[4010]: I0319 09:17:01.328253 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.328320 master-0 kubenswrapper[4010]: I0319 09:17:01.328263 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.328837 master-0 kubenswrapper[4010]: I0319 09:17:01.328425 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.329617 master-0 kubenswrapper[4010]: I0319 09:17:01.329572 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.329617 master-0 kubenswrapper[4010]: I0319 09:17:01.329613 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.329760 master-0 kubenswrapper[4010]: I0319 09:17:01.329625 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.329948 master-0 kubenswrapper[4010]: I0319 09:17:01.329873 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:17:01.330020 master-0 kubenswrapper[4010]: I0319 09:17:01.329965 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:17:01.330020 master-0 kubenswrapper[4010]: I0319 09:17:01.329899 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.330020 master-0 kubenswrapper[4010]: I0319 09:17:01.330018 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.330228 master-0 kubenswrapper[4010]: I0319 09:17:01.329973 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.330733 master-0 kubenswrapper[4010]: I0319 09:17:01.330690 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.330733 master-0 kubenswrapper[4010]: I0319 09:17:01.330723 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.330733 master-0 kubenswrapper[4010]: I0319 09:17:01.330733 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.330950 master-0 kubenswrapper[4010]: I0319 09:17:01.330931 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.330950 master-0 kubenswrapper[4010]: I0319 09:17:01.330952 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.331098 master-0 kubenswrapper[4010]: I0319 09:17:01.330962 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.331224 master-0 kubenswrapper[4010]: I0319 09:17:01.331176 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.331224 master-0 kubenswrapper[4010]:
I0319 09:17:01.331206 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:01.331224 master-0 kubenswrapper[4010]: I0319 09:17:01.331216 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:01.331400 master-0 kubenswrapper[4010]: I0319 09:17:01.331297 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:01.331400 master-0 kubenswrapper[4010]: I0319 09:17:01.331385 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.331400 master-0 kubenswrapper[4010]: I0319 09:17:01.331404 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:01.331870 master-0 kubenswrapper[4010]: I0319 09:17:01.331826 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:01.331870 master-0 kubenswrapper[4010]: I0319 09:17:01.331857 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:01.331870 master-0 kubenswrapper[4010]: I0319 09:17:01.331869 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:01.332088 master-0 kubenswrapper[4010]: I0319 09:17:01.331910 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:01.332088 master-0 kubenswrapper[4010]: I0319 09:17:01.331947 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:01.332088 master-0 kubenswrapper[4010]: I0319 09:17:01.331957 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Mar 19 09:17:01.332088 master-0 kubenswrapper[4010]: I0319 09:17:01.331974 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:01.332088 master-0 kubenswrapper[4010]: I0319 09:17:01.332051 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.332088 master-0 kubenswrapper[4010]: I0319 09:17:01.332070 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:01.332780 master-0 kubenswrapper[4010]: I0319 09:17:01.332743 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:01.332780 master-0 kubenswrapper[4010]: I0319 09:17:01.332771 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:01.332780 master-0 kubenswrapper[4010]: I0319 09:17:01.332781 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:01.332994 master-0 kubenswrapper[4010]: I0319 09:17:01.332745 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:01.332994 master-0 kubenswrapper[4010]: I0319 09:17:01.332887 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:01.332994 master-0 kubenswrapper[4010]: I0319 09:17:01.332899 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:01.333172 master-0 kubenswrapper[4010]: I0319 09:17:01.333084 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.333172 master-0 kubenswrapper[4010]: I0319 09:17:01.333115 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:01.333955 master-0 kubenswrapper[4010]: I0319 09:17:01.333901 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:01.334082 master-0 kubenswrapper[4010]: I0319 09:17:01.333986 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:01.334082 master-0 kubenswrapper[4010]: I0319 09:17:01.334014 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:01.367028 master-0 kubenswrapper[4010]: I0319 09:17:01.366911 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.367028 master-0 kubenswrapper[4010]: I0319 09:17:01.366965 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.367028 master-0 kubenswrapper[4010]: I0319 09:17:01.367022 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod 
\"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367117 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367217 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367273 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367317 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367355 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367382 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367402 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367422 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367460 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 
19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367502 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.367455 master-0 kubenswrapper[4010]: I0319 09:17:01.367521 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.368347 master-0 kubenswrapper[4010]: I0319 09:17:01.367540 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:01.368347 master-0 kubenswrapper[4010]: I0319 09:17:01.367558 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.368347 master-0 kubenswrapper[4010]: I0319 09:17:01.367575 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: 
\"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468093 master-0 kubenswrapper[4010]: I0319 09:17:01.468032 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468093 master-0 kubenswrapper[4010]: I0319 09:17:01.468080 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468266 master-0 kubenswrapper[4010]: I0319 09:17:01.468110 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468266 master-0 kubenswrapper[4010]: I0319 09:17:01.468194 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468266 master-0 kubenswrapper[4010]: I0319 09:17:01.468199 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod 
\"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468266 master-0 kubenswrapper[4010]: I0319 09:17:01.468263 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.468382 master-0 kubenswrapper[4010]: I0319 09:17:01.468283 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:01.468382 master-0 kubenswrapper[4010]: I0319 09:17:01.468286 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468382 master-0 kubenswrapper[4010]: I0319 09:17:01.468332 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.468382 master-0 kubenswrapper[4010]: I0319 09:17:01.468378 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" 
(UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468521 master-0 kubenswrapper[4010]: I0319 09:17:01.468413 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468521 master-0 kubenswrapper[4010]: I0319 09:17:01.468430 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:01.468521 master-0 kubenswrapper[4010]: I0319 09:17:01.468445 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468521 master-0 kubenswrapper[4010]: I0319 09:17:01.468500 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468521 master-0 kubenswrapper[4010]: I0319 09:17:01.468503 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" 
(UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468521 master-0 kubenswrapper[4010]: I0319 09:17:01.468522 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468696 master-0 kubenswrapper[4010]: I0319 09:17:01.468537 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.468696 master-0 kubenswrapper[4010]: I0319 09:17:01.468589 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468696 master-0 kubenswrapper[4010]: I0319 09:17:01.468659 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.468779 master-0 kubenswrapper[4010]: I0319 09:17:01.468715 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468779 master-0 kubenswrapper[4010]: I0319 09:17:01.468758 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.468779 master-0 kubenswrapper[4010]: I0319 09:17:01.468764 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468866 master-0 kubenswrapper[4010]: I0319 09:17:01.468798 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:01.468866 master-0 kubenswrapper[4010]: I0319 09:17:01.468825 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.468866 master-0 kubenswrapper[4010]: I0319 09:17:01.468843 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:17:01.468947 master-0 kubenswrapper[4010]: I0319 09:17:01.468839 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468947 master-0 kubenswrapper[4010]: I0319 09:17:01.468875 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468947 master-0 kubenswrapper[4010]: I0319 09:17:01.468897 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:17:01.468947 master-0 kubenswrapper[4010]: I0319 09:17:01.468907 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.468947 master-0 kubenswrapper[4010]: I0319 09:17:01.468941 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.469077 master-0 kubenswrapper[4010]: I0319 09:17:01.468965 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:01.469077 master-0 kubenswrapper[4010]: I0319 09:17:01.468979 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.469077 master-0 kubenswrapper[4010]: I0319 09:17:01.469000 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:01.469077 master-0 kubenswrapper[4010]: I0319 09:17:01.469022 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:17:01.497606 master-0 kubenswrapper[4010]: I0319 09:17:01.497496 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 
09:17:01.498961 master-0 kubenswrapper[4010]: I0319 09:17:01.498898 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.499091 master-0 kubenswrapper[4010]: I0319 09:17:01.498972 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.499091 master-0 kubenswrapper[4010]: I0319 09:17:01.498999 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.499091 master-0 kubenswrapper[4010]: I0319 09:17:01.499078 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:01.500233 master-0 kubenswrapper[4010]: E0319 09:17:01.500155 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:01.669561 master-0 kubenswrapper[4010]: E0319 09:17:01.669435 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 19 09:17:01.683052 master-0 kubenswrapper[4010]: I0319 09:17:01.682887 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:17:01.700332 master-0 kubenswrapper[4010]: I0319 09:17:01.700300 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:17:01.708124 master-0 kubenswrapper[4010]: I0319 09:17:01.708097 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:17:01.746063 master-0 kubenswrapper[4010]: I0319 09:17:01.745979 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:17:01.754182 master-0 kubenswrapper[4010]: I0319 09:17:01.754126 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:17:01.900355 master-0 kubenswrapper[4010]: I0319 09:17:01.900267 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:01.901261 master-0 kubenswrapper[4010]: I0319 09:17:01.901216 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:01.901261 master-0 kubenswrapper[4010]: I0319 09:17:01.901252 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:01.901261 master-0 kubenswrapper[4010]: I0319 09:17:01.901260 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:01.901411 master-0 kubenswrapper[4010]: I0319 09:17:01.901299 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:01.902090 master-0 kubenswrapper[4010]: E0319 09:17:01.902035 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:01.975001 master-0 kubenswrapper[4010]: W0319 09:17:01.974824 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:01.975001 master-0 kubenswrapper[4010]: E0319 09:17:01.974905 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:02.059724 master-0 kubenswrapper[4010]: I0319 09:17:02.059623 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:02.367601 master-0 kubenswrapper[4010]: W0319 09:17:02.367530 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83737980b9ee109184b1d78e942cf36.slice/crio-71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15 WatchSource:0}: Error finding container 71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15: Status 404 returned error can't find the container with id 71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15
Mar 19 09:17:02.368663 master-0 kubenswrapper[4010]: W0319 09:17:02.368624 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386 WatchSource:0}: Error finding container b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386: Status 404 returned error can't find the container with id b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386
Mar 19 09:17:02.379108 master-0 kubenswrapper[4010]: I0319 09:17:02.379043 4010 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:17:02.394224 master-0 kubenswrapper[4010]: W0319 09:17:02.394146 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fac1b46a11e49501805e891baae4a9.slice/crio-68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334 WatchSource:0}: Error finding container 68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334: Status 404 returned error can't find the container with id 68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334
Mar 19 09:17:02.423040 master-0 kubenswrapper[4010]: W0319 09:17:02.422971 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1249822f86f23526277d165c0d5d3c19.slice/crio-4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931 WatchSource:0}: Error finding container 4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931: Status 404 returned error can't find the container with id 4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931
Mar 19 09:17:02.471043 master-0 kubenswrapper[4010]: E0319 09:17:02.470956 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 09:17:02.471784 master-0 kubenswrapper[4010]: W0319 09:17:02.471714 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f265536aba6292ead501bc9b49f327.slice/crio-c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede WatchSource:0}: Error finding container c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede: Status 404 returned error can't find the container with id c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede
Mar 19 09:17:02.478815 master-0 kubenswrapper[4010]: W0319 09:17:02.478709 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:02.478934 master-0 kubenswrapper[4010]: E0319 09:17:02.478825 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:02.597767 master-0 kubenswrapper[4010]: W0319 09:17:02.597633 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:02.597767 master-0 kubenswrapper[4010]: E0319 09:17:02.597727 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:02.649592 master-0 kubenswrapper[4010]: W0319 09:17:02.649239 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:02.649592 master-0 kubenswrapper[4010]: E0319 09:17:02.649344 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:02.702501 master-0 kubenswrapper[4010]: I0319 09:17:02.702403 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:02.704176 master-0 kubenswrapper[4010]: I0319 09:17:02.704117 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:02.704176 master-0 kubenswrapper[4010]: I0319 09:17:02.704171 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:02.704342 master-0 kubenswrapper[4010]: I0319 09:17:02.704194 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:02.704342 master-0 kubenswrapper[4010]: I0319 09:17:02.704244 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:02.705250 master-0 kubenswrapper[4010]: E0319 09:17:02.705200 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:03.059819 master-0 kubenswrapper[4010]: I0319 09:17:03.059694 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:03.101465 master-0 kubenswrapper[4010]: I0319 09:17:03.101372 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:03.102805 master-0 kubenswrapper[4010]: E0319 09:17:03.102730 4010 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:03.232889 master-0 kubenswrapper[4010]: I0319 09:17:03.232770 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334"}
Mar 19 09:17:03.233524 master-0 kubenswrapper[4010]: I0319 09:17:03.233455 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15"}
Mar 19 09:17:03.235940 master-0 kubenswrapper[4010]: I0319 09:17:03.235890 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386"}
Mar 19 09:17:03.237303 master-0 kubenswrapper[4010]: I0319 09:17:03.237274 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede"}
Mar 19 09:17:03.238547 master-0 kubenswrapper[4010]: I0319 09:17:03.238504 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931"}
Mar 19 09:17:03.969766 master-0 kubenswrapper[4010]: W0319 09:17:03.969684 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:03.969766 master-0 kubenswrapper[4010]: E0319 09:17:03.969770 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:04.058898 master-0 kubenswrapper[4010]: I0319 09:17:04.058836 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:04.071938 master-0 kubenswrapper[4010]: E0319 09:17:04.071894 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 19 09:17:04.305818 master-0 kubenswrapper[4010]: I0319 09:17:04.305702 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:04.306864 master-0 kubenswrapper[4010]: I0319 09:17:04.306812 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:04.306864 master-0 kubenswrapper[4010]: I0319 09:17:04.306852 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:04.306864 master-0 kubenswrapper[4010]: I0319 09:17:04.306863 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:04.307290 master-0 kubenswrapper[4010]: I0319 09:17:04.307275 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:04.308151 master-0 kubenswrapper[4010]: E0319 09:17:04.308093 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:04.562110 master-0 kubenswrapper[4010]: W0319 09:17:04.562043 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:04.562249 master-0 kubenswrapper[4010]: E0319 09:17:04.562129 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.059496 master-0 kubenswrapper[4010]: I0319 09:17:05.059403 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.063270 master-0 kubenswrapper[4010]: W0319 09:17:05.063222 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.063348 master-0 kubenswrapper[4010]: E0319 09:17:05.063282 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:05.245397 master-0 kubenswrapper[4010]: I0319 09:17:05.245303 4010 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="c218293403aa861a38085870e890bceedfe5394df8d5e259c54d305af3fdeae9" exitCode=0
Mar 19 09:17:05.245397 master-0 kubenswrapper[4010]: I0319 09:17:05.245388 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"c218293403aa861a38085870e890bceedfe5394df8d5e259c54d305af3fdeae9"}
Mar 19 09:17:05.246291 master-0 kubenswrapper[4010]: I0319 09:17:05.245419 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:05.246899 master-0 kubenswrapper[4010]: I0319 09:17:05.246865 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:05.246899 master-0 kubenswrapper[4010]: I0319 09:17:05.246898 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:05.247003 master-0 kubenswrapper[4010]: I0319 09:17:05.246913 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:05.452312 master-0 kubenswrapper[4010]: W0319 09:17:05.452256 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:05.452312 master-0 kubenswrapper[4010]: E0319 09:17:05.452301 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:06.059213 master-0 kubenswrapper[4010]: I0319 09:17:06.059132 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:06.248887 master-0 kubenswrapper[4010]: I0319 09:17:06.248848 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 09:17:06.249626 master-0 kubenswrapper[4010]: I0319 09:17:06.249274 4010 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="384d69f9ada7cda8556802257cd31e0721d0966b3de711e4d62ed3f256aced54" exitCode=1
Mar 19 09:17:06.249626 master-0 kubenswrapper[4010]: I0319 09:17:06.249338 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"384d69f9ada7cda8556802257cd31e0721d0966b3de711e4d62ed3f256aced54"}
Mar 19 09:17:06.249626 master-0 kubenswrapper[4010]: I0319 09:17:06.249349 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:06.249939 master-0 kubenswrapper[4010]: I0319 09:17:06.249907 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:06.249939 master-0 kubenswrapper[4010]: I0319 09:17:06.249934 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:06.250028 master-0 kubenswrapper[4010]: I0319 09:17:06.249944 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:06.250213 master-0 kubenswrapper[4010]: I0319 09:17:06.250193 4010 scope.go:117] "RemoveContainer" containerID="384d69f9ada7cda8556802257cd31e0721d0966b3de711e4d62ed3f256aced54"
Mar 19 09:17:06.251298 master-0 kubenswrapper[4010]: I0319 09:17:06.251260 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a"}
Mar 19 09:17:06.251298 master-0 kubenswrapper[4010]: I0319 09:17:06.251284 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b"}
Mar 19 09:17:06.251298 master-0 kubenswrapper[4010]: I0319 09:17:06.251293 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:06.251880 master-0 kubenswrapper[4010]: I0319 09:17:06.251845 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:06.251940 master-0 kubenswrapper[4010]: I0319 09:17:06.251884 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:06.251940 master-0 kubenswrapper[4010]: I0319 09:17:06.251897 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:07.059597 master-0 kubenswrapper[4010]: I0319 09:17:07.059548 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:07.234168 master-0 kubenswrapper[4010]: I0319 09:17:07.234121 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:07.235843 master-0 kubenswrapper[4010]: E0319 09:17:07.235795 4010 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:07.256039 master-0 kubenswrapper[4010]: I0319 09:17:07.255996 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 09:17:07.256795 master-0 kubenswrapper[4010]: I0319 09:17:07.256762 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log"
Mar 19 09:17:07.257292 master-0 kubenswrapper[4010]: I0319 09:17:07.257261 4010 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="0e597853286cd578d7724724b018289225861b5b1a0d3344e56ed5aa003721a7" exitCode=1
Mar 19 09:17:07.257365 master-0 kubenswrapper[4010]: I0319 09:17:07.257350 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:07.257411 master-0 kubenswrapper[4010]: I0319 09:17:07.257379 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:07.257491 master-0 kubenswrapper[4010]: I0319 09:17:07.257330 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"0e597853286cd578d7724724b018289225861b5b1a0d3344e56ed5aa003721a7"}
Mar 19 09:17:07.257539 master-0 kubenswrapper[4010]: I0319 09:17:07.257522 4010 scope.go:117] "RemoveContainer" containerID="384d69f9ada7cda8556802257cd31e0721d0966b3de711e4d62ed3f256aced54"
Mar 19 09:17:07.258259 master-0 kubenswrapper[4010]: I0319 09:17:07.258233 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:07.258259 master-0 kubenswrapper[4010]: I0319 09:17:07.258256 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:07.258330 master-0 kubenswrapper[4010]: I0319 09:17:07.258258 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:07.258330 master-0 kubenswrapper[4010]: I0319 09:17:07.258264 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:07.258379 master-0 kubenswrapper[4010]: I0319 09:17:07.258280 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:07.258431 master-0 kubenswrapper[4010]: I0319 09:17:07.258408 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:07.258817 master-0 kubenswrapper[4010]: I0319 09:17:07.258788 4010 scope.go:117] "RemoveContainer" containerID="0e597853286cd578d7724724b018289225861b5b1a0d3344e56ed5aa003721a7"
Mar 19 09:17:07.258979 master-0 kubenswrapper[4010]: E0319 09:17:07.258955 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 09:17:07.275890 master-0 kubenswrapper[4010]: E0319 09:17:07.275805 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 19 09:17:07.508440 master-0 kubenswrapper[4010]: I0319 09:17:07.508392 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:07.509643 master-0 kubenswrapper[4010]: I0319 09:17:07.509611 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:07.509709 master-0 kubenswrapper[4010]: I0319 09:17:07.509652 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:07.509709 master-0 kubenswrapper[4010]: I0319 09:17:07.509665 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:07.509709 master-0 kubenswrapper[4010]: I0319 09:17:07.509711 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:07.510742 master-0 kubenswrapper[4010]: E0319 09:17:07.510716 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:08.059169 master-0 kubenswrapper[4010]: I0319 09:17:08.059106 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:08.209505 master-0 kubenswrapper[4010]: W0319 09:17:08.209399 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:08.209505 master-0 kubenswrapper[4010]: E0319 09:17:08.209512 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:08.260557 master-0 kubenswrapper[4010]: I0319 09:17:08.260511 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 19 09:17:08.261055 master-0 kubenswrapper[4010]: I0319 09:17:08.260906 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:08.261586 master-0 kubenswrapper[4010]: I0319 09:17:08.261565 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:08.261649 master-0 kubenswrapper[4010]: I0319 09:17:08.261592 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:08.261649 master-0 kubenswrapper[4010]: I0319 09:17:08.261604 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:08.261916 master-0 kubenswrapper[4010]: I0319 09:17:08.261895 4010 scope.go:117] "RemoveContainer" containerID="0e597853286cd578d7724724b018289225861b5b1a0d3344e56ed5aa003721a7"
Mar 19 09:17:08.262072 master-0 kubenswrapper[4010]: E0319 09:17:08.262043 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 09:17:09.059872 master-0 kubenswrapper[4010]: I0319 09:17:09.059818 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:09.100500 master-0 kubenswrapper[4010]: W0319 09:17:09.100391 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:09.100500 master-0 kubenswrapper[4010]: E0319 09:17:09.100496 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:09.817537 master-0 kubenswrapper[4010]: W0319 09:17:09.817410 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:09.817537 master-0 kubenswrapper[4010]: E0319 09:17:09.817537 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:09.839936 master-0 kubenswrapper[4010]: E0319 09:17:09.839814 4010 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e336714d8a7f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.054945268 +0000 UTC m=+0.580889885,LastTimestamp:2026-03-19 09:17:01.054945268 +0000 UTC m=+0.580889885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:10.059079 master-0 kubenswrapper[4010]: I0319 09:17:10.059015 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:10.868051 master-0 kubenswrapper[4010]: W0319 09:17:10.867947 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:10.868523 master-0 kubenswrapper[4010]: E0319 09:17:10.868064 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:17:11.059959 master-0 kubenswrapper[4010]: I0319 09:17:11.059875 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:11.197178 master-0 kubenswrapper[4010]: E0319 09:17:11.197108 4010 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 09:17:12.059245 master-0 kubenswrapper[4010]: I0319 09:17:12.059155 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:13.060560 master-0 kubenswrapper[4010]: I0319 09:17:13.060482 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:13.270778 master-0 kubenswrapper[4010]: I0319 09:17:13.270700 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266"}
Mar 19 09:17:13.272107 master-0 kubenswrapper[4010]: I0319 09:17:13.272069 4010 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="a84c1f34c626f1387c9440e1656352bf22e178dc307b15faa17e2d14af155731" exitCode=0
Mar 19 09:17:13.272225 master-0 kubenswrapper[4010]: I0319 09:17:13.272119 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"a84c1f34c626f1387c9440e1656352bf22e178dc307b15faa17e2d14af155731"}
Mar 19 09:17:13.272225 master-0 kubenswrapper[4010]: I0319 09:17:13.272205 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:13.273094 master-0 kubenswrapper[4010]: I0319 09:17:13.273054 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:13.273151 master-0 kubenswrapper[4010]: I0319 09:17:13.273098 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:13.273151 master-0 kubenswrapper[4010]: I0319 09:17:13.273110 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:13.274488 master-0 kubenswrapper[4010]: I0319 09:17:13.274433 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d"}
Mar 19 09:17:13.274653 master-0 kubenswrapper[4010]: I0319 09:17:13.274587 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:13.275703 master-0 kubenswrapper[4010]: I0319 09:17:13.275665 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:13.275703 master-0 kubenswrapper[4010]: I0319 09:17:13.275704 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:13.275801 master-0 kubenswrapper[4010]: I0319 09:17:13.275716 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:13.276514 master-0 kubenswrapper[4010]: I0319 09:17:13.276488 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:13.277164 master-0 kubenswrapper[4010]: I0319 09:17:13.277125 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:13.277164 master-0 kubenswrapper[4010]: I0319 09:17:13.277160 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:13.277261 master-0 kubenswrapper[4010]: I0319 09:17:13.277173 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:13.677267 master-0 kubenswrapper[4010]: E0319 09:17:13.677200 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 19 09:17:13.911257 master-0 kubenswrapper[4010]: I0319 09:17:13.911077 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:13.912116 master-0 kubenswrapper[4010]: I0319 09:17:13.912080 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:13.912116 master-0 kubenswrapper[4010]: I0319 09:17:13.912119 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:13.912223 master-0 kubenswrapper[4010]: I0319 09:17:13.912129 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:13.912223 master-0 kubenswrapper[4010]: I0319 09:17:13.912187 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:13.913199 master-0 kubenswrapper[4010]: E0319 09:17:13.913130 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:17:14.059821 master-0 kubenswrapper[4010]: I0319 09:17:14.059728 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:14.276235 master-0 kubenswrapper[4010]: I0319 09:17:14.276142 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:14.277044 master-0 kubenswrapper[4010]: I0319 09:17:14.276975 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:14.277044 master-0 kubenswrapper[4010]: I0319 09:17:14.277035 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:14.277044 master-0 kubenswrapper[4010]: I0319 09:17:14.277047 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:15.059626 master-0 kubenswrapper[4010]: I0319 09:17:15.059530 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:17:15.282168 master-0 kubenswrapper[4010]: I0319 09:17:15.281985 4010 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266" exitCode=1
Mar 19 09:17:15.282168 master-0 kubenswrapper[4010]: I0319 09:17:15.282094 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266"}
Mar 19 09:17:15.284839 master-0 kubenswrapper[4010]: I0319 09:17:15.284783 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"7716e20f21898d48a97cdc11ca530decd4b56cabb9557337c593d6dc0a3abe47"}
Mar 19 09:17:15.779935 master-0 kubenswrapper[4010]: I0319 09:17:15.778804 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:17:16.613354 master-0 kubenswrapper[4010]: I0319 09:17:16.611936 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:16.617223 master-0 kubenswrapper[4010]: E0319 09:17:16.616944 4010 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: certificatesigningrequests.certificates.k8s.io is forbidden: User \"system:serviceaccount:openshift-machine-config-operator:node-bootstrapper\" cannot create resource \"certificatesigningrequests\" in API group \"certificates.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:17.063530 master-0 kubenswrapper[4010]: I0319 09:17:17.063491 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:17.547660 master-0 kubenswrapper[4010]: W0319 09:17:17.547580 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:17.547660 master-0 kubenswrapper[4010]: E0319 09:17:17.547626 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:18.063282 master-0 kubenswrapper[4010]: I0319 09:17:18.063237 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:18.781253 master-0 kubenswrapper[4010]: W0319 09:17:18.781214 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 19 09:17:18.781458 master-0 kubenswrapper[4010]: E0319 09:17:18.781260 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:18.803899 master-0 kubenswrapper[4010]: W0319 09:17:18.803858 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 19 09:17:18.804073 master-0 kubenswrapper[4010]: E0319 09:17:18.803909 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 19 09:17:19.063003 master-0 kubenswrapper[4010]: I0319 09:17:19.062686 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 09:17:19.226418 master-0 kubenswrapper[4010]: I0319 09:17:19.226343 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:19.227603 master-0 kubenswrapper[4010]: I0319 09:17:19.227541 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:19.227603 master-0 kubenswrapper[4010]: I0319 09:17:19.227605 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:19.227825 master-0 kubenswrapper[4010]: I0319 09:17:19.227617 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:19.227921 master-0 kubenswrapper[4010]: I0319 09:17:19.227900 4010 scope.go:117] "RemoveContainer" containerID="0e597853286cd578d7724724b018289225861b5b1a0d3344e56ed5aa003721a7"
Mar 19 09:17:19.300309 master-0 kubenswrapper[4010]: I0319 09:17:19.300265 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a7909254e1fd575ef7a679770eb6617922c50b1fbb682ef07075bcdacdc5e021"}
Mar 19 09:17:19.300535 master-0 kubenswrapper[4010]: I0319 09:17:19.300357 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:19.301191 master-0 kubenswrapper[4010]: I0319 09:17:19.301156 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:19.301191 master-0 kubenswrapper[4010]: I0319 09:17:19.301187 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:19.301373 master-0 kubenswrapper[4010]: I0319 09:17:19.301199 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:19.301531 master-0 kubenswrapper[4010]: I0319 09:17:19.301507 4010 scope.go:117] "RemoveContainer" containerID="45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266"
Mar 19 09:17:19.302075 master-0 kubenswrapper[4010]: I0319 09:17:19.302042 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"157ec68d28f9ad49e7460cf4325702e32a61a87e98a342a6b3f00e830966c9b0"}
Mar 19 09:17:19.302193 master-0 kubenswrapper[4010]: I0319 09:17:19.302159 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:19.302847 master-0 kubenswrapper[4010]: I0319 09:17:19.302813 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:19.302931 master-0 kubenswrapper[4010]: I0319 09:17:19.302839 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:19.302931 master-0 kubenswrapper[4010]: I0319 09:17:19.302875 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:19.844670 master-0 kubenswrapper[4010]: E0319 09:17:19.844565 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336714d8a7f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.054945268 +0000 UTC m=+0.580889885,LastTimestamp:2026-03-19 09:17:01.054945268 +0000 UTC m=+0.580889885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.847869 master-0 kubenswrapper[4010]: E0319 09:17:19.847796 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.850874 master-0 kubenswrapper[4010]: E0319 09:17:19.850810 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.853959 master-0 kubenswrapper[4010]: E0319 09:17:19.853893 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.856984 master-0 kubenswrapper[4010]: E0319 09:17:19.856915 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671d6a508f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.198708879 +0000 UTC m=+0.724653496,LastTimestamp:2026-03-19 09:17:01.198708879 +0000 UTC m=+0.724653496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.860488 master-0 kubenswrapper[4010]: E0319 09:17:19.860420 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.296178253 +0000 UTC m=+0.822122870,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.863263 master-0 kubenswrapper[4010]: E0319 09:17:19.863206 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.29620742 +0000 UTC m=+0.822152037,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.866205 master-0 kubenswrapper[4010]: E0319 09:17:19.866150 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e33671864d45b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.296220789 +0000 UTC m=+0.822165406,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.872992 master-0 kubenswrapper[4010]: E0319 09:17:19.872887 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.328233411 +0000 UTC m=+0.854178018,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.877430 master-0 kubenswrapper[4010]: E0319 09:17:19.877342 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.328259838 +0000 UTC m=+0.854204445,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.881847 master-0 kubenswrapper[4010]: E0319 09:17:19.881757 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e33671864d45b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.328267567 +0000 UTC m=+0.854212174,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.885126 master-0 kubenswrapper[4010]: E0319 09:17:19.885050 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.329598715 +0000 UTC m=+0.855543322,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.888926 master-0 kubenswrapper[4010]: E0319 09:17:19.888833 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.329621523 +0000 UTC m=+0.855566130,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.892392 master-0 kubenswrapper[4010]: E0319 09:17:19.892335 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e33671864d45b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.329632992 +0000 UTC m=+0.855577599,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.896000 master-0 kubenswrapper[4010]: E0319 09:17:19.895912 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.330713097 +0000 UTC m=+0.856657704,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.899504 master-0 kubenswrapper[4010]: E0319 09:17:19.899432 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.330729575 +0000 UTC m=+0.856674182,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.903191 master-0 kubenswrapper[4010]: E0319 09:17:19.903131 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e33671864d45b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.330739464 +0000 UTC m=+0.856684071,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.907100 master-0 kubenswrapper[4010]: E0319 09:17:19.907039 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.330945702 +0000 UTC m=+0.856890309,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.910874 master-0 kubenswrapper[4010]: E0319 09:17:19.910810 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.33095865 +0000 UTC m=+0.856903257,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.914181 master-0 kubenswrapper[4010]: E0319 09:17:19.914115 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e33671864d45b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.330969899 +0000 UTC m=+0.856914506,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.917233 master-0 kubenswrapper[4010]: E0319 09:17:19.917175 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.331197185 +0000 UTC m=+0.857141792,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.920623 master-0 kubenswrapper[4010]: E0319 09:17:19.920530 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.331212703 +0000 UTC m=+0.857157310,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:17:19.924025 master-0 kubenswrapper[4010]: E0319 09:17:19.923925 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e33671864d45b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e33671864d45b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114463323 +0000 UTC m=+0.640407930,LastTimestamp:2026-03-19 09:17:01.331223982 +0000 UTC m=+0.857168589,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.928598 master-0 kubenswrapper[4010]: E0319 09:17:19.928523 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e336718627488\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e336718627488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.11430772 +0000 UTC m=+0.640252327,LastTimestamp:2026-03-19 09:17:01.331848475 +0000 UTC m=+0.857793092,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.931822 master-0 kubenswrapper[4010]: E0319 09:17:19.931757 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3367186382ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3367186382ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:01.114376942 +0000 UTC m=+0.640321549,LastTimestamp:2026-03-19 09:17:01.331864333 +0000 UTC m=+0.857808940,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.935564 master-0 kubenswrapper[4010]: E0319 09:17:19.935498 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e336763c29b93 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:02.378900371 +0000 UTC m=+1.904845018,LastTimestamp:2026-03-19 09:17:02.378900371 +0000 UTC m=+1.904845018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.938349 master-0 kubenswrapper[4010]: E0319 09:17:19.938293 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e336763c2eb7d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:02.378920829 +0000 UTC m=+1.904865446,LastTimestamp:2026-03-19 09:17:02.378920829 +0000 UTC m=+1.904865446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.941873 master-0 kubenswrapper[4010]: E0319 09:17:19.941818 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336765a3eb54 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:02.410443604 +0000 UTC m=+1.936388221,LastTimestamp:2026-03-19 09:17:02.410443604 +0000 UTC m=+1.936388221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.945103 master-0 kubenswrapper[4010]: E0319 09:17:19.944999 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336766968244 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:02.426341956 +0000 UTC m=+1.952286563,LastTimestamp:2026-03-19 09:17:02.426341956 +0000 UTC m=+1.952286563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.948812 master-0 kubenswrapper[4010]: E0319 09:17:19.948747 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3367696eb743 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:02.474065731 +0000 UTC m=+2.000010328,LastTimestamp:2026-03-19 09:17:02.474065731 +0000 UTC m=+2.000010328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.953005 
master-0 kubenswrapper[4010]: E0319 09:17:19.952946 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3367d8e57f76 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" in 1.917s (1.917s including waiting). Image size: 465090934 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:04.344121206 +0000 UTC m=+3.870065813,LastTimestamp:2026-03-19 09:17:04.344121206 +0000 UTC m=+3.870065813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.957051 master-0 kubenswrapper[4010]: E0319 09:17:19.956993 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3367e6e7eaee openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: 
setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:04.579160814 +0000 UTC m=+4.105105421,LastTimestamp:2026-03-19 09:17:04.579160814 +0000 UTC m=+4.105105421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.961269 master-0 kubenswrapper[4010]: E0319 09:17:19.961201 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3367e7dca47d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:04.595199101 +0000 UTC m=+4.121143708,LastTimestamp:2026-03-19 09:17:04.595199101 +0000 UTC m=+4.121143708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.965689 master-0 kubenswrapper[4010]: E0319 09:17:19.965618 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3368134a1703 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.323792131 +0000 UTC m=+4.849736728,LastTimestamp:2026-03-19 09:17:05.323792131 +0000 UTC m=+4.849736728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.970188 master-0 kubenswrapper[4010]: E0319 09:17:19.970097 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3368173ac2bb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" in 3.01s (3.01s including waiting). 
Image size: 529326739 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.389896379 +0000 UTC m=+4.915840986,LastTimestamp:2026-03-19 09:17:05.389896379 +0000 UTC m=+4.915840986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.974371 master-0 kubenswrapper[4010]: E0319 09:17:19.974253 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33681df7de75 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.502953077 +0000 UTC m=+5.028897684,LastTimestamp:2026-03-19 09:17:05.502953077 +0000 UTC m=+5.028897684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.978292 master-0 kubenswrapper[4010]: E0319 09:17:19.978213 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33681eaa4120 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.514643744 +0000 UTC m=+5.040588351,LastTimestamp:2026-03-19 09:17:05.514643744 +0000 UTC m=+5.040588351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.981833 master-0 kubenswrapper[4010]: E0319 09:17:19.981748 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e336821d86de7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.568001511 +0000 UTC m=+5.093946118,LastTimestamp:2026-03-19 09:17:05.568001511 +0000 UTC m=+5.093946118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.985515 master-0 kubenswrapper[4010]: E0319 09:17:19.985358 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e336822a7f48d 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.581601933 +0000 UTC m=+5.107546540,LastTimestamp:2026-03-19 09:17:05.581601933 +0000 UTC m=+5.107546540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.989515 master-0 kubenswrapper[4010]: E0319 09:17:19.989387 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e336822c89aba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.583741626 +0000 UTC m=+5.109686233,LastTimestamp:2026-03-19 09:17:05.583741626 +0000 UTC m=+5.109686233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.993973 master-0 kubenswrapper[4010]: E0319 09:17:19.993873 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e33682db311c9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.766879689 +0000 UTC m=+5.292824296,LastTimestamp:2026-03-19 09:17:05.766879689 +0000 UTC m=+5.292824296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:19.998360 master-0 kubenswrapper[4010]: E0319 09:17:19.998234 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e33682e813e5e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.780391518 +0000 UTC m=+5.306336125,LastTimestamp:2026-03-19 09:17:05.780391518 +0000 UTC m=+5.306336125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.003436 master-0 kubenswrapper[4010]: E0319 09:17:20.003336 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3368134a1703\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3368134a1703 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.323792131 +0000 UTC m=+4.849736728,LastTimestamp:2026-03-19 09:17:06.252778286 +0000 UTC m=+5.778722893,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.008759 master-0 kubenswrapper[4010]: E0319 09:17:20.008563 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33681df7de75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33681df7de75 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.502953077 +0000 UTC 
m=+5.028897684,LastTimestamp:2026-03-19 09:17:06.463274739 +0000 UTC m=+5.989219336,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.024423 master-0 kubenswrapper[4010]: E0319 09:17:20.024308 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33681eaa4120\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33681eaa4120 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.514643744 +0000 UTC m=+5.040588351,LastTimestamp:2026-03-19 09:17:06.484821442 +0000 UTC m=+6.010766049,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.030235 master-0 kubenswrapper[4010]: E0319 09:17:20.030139 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336886a1ec3b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.258928187 +0000 UTC m=+6.784872794,LastTimestamp:2026-03-19 09:17:07.258928187 +0000 UTC m=+6.784872794,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.034168 master-0 kubenswrapper[4010]: E0319 09:17:20.034048 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e336886a1ec3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336886a1ec3b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.258928187 +0000 UTC m=+6.784872794,LastTimestamp:2026-03-19 09:17:08.262021606 +0000 UTC m=+7.787966213,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.037943 master-0 kubenswrapper[4010]: E0319 09:17:20.037853 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3369c2ddea43 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 10.185s (10.185s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.564460099 +0000 UTC m=+12.090404746,LastTimestamp:2026-03-19 09:17:12.564460099 +0000 UTC m=+12.090404746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.041345 master-0 kubenswrapper[4010]: E0319 09:17:20.041284 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3369c5c33504 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 10.202s (10.202s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.613041412 +0000 UTC m=+12.138986029,LastTimestamp:2026-03-19 09:17:12.613041412 +0000 UTC m=+12.138986029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.045570 master-0 kubenswrapper[4010]: E0319 09:17:20.045459 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3369ce7152f5 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.758670069 +0000 UTC m=+12.284614666,LastTimestamp:2026-03-19 09:17:12.758670069 +0000 UTC m=+12.284614666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.049350 master-0 kubenswrapper[4010]: E0319 09:17:20.049223 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3369cf9fd6cd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 10.304s (10.304s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.778495693 +0000 UTC m=+12.304440300,LastTimestamp:2026-03-19 09:17:12.778495693 +0000 UTC m=+12.304440300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.052930 master-0 kubenswrapper[4010]: E0319 09:17:20.052787 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3369d0fd34f7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.801391863 +0000 UTC m=+12.327336510,LastTimestamp:2026-03-19 09:17:12.801391863 +0000 UTC m=+12.327336510,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.056237 master-0 kubenswrapper[4010]: E0319 09:17:20.056130 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3369d13dd899 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.805628057 +0000 UTC m=+12.331572664,LastTimestamp:2026-03-19 09:17:12.805628057 +0000 UTC m=+12.331572664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.060215 master-0 kubenswrapper[4010]: E0319 09:17:20.060078 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3369d2fea6e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:12.835040998 +0000 UTC 
m=+12.360985605,LastTimestamp:2026-03-19 09:17:12.835040998 +0000 UTC m=+12.360985605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.064544 master-0 kubenswrapper[4010]: I0319 09:17:20.064513 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:20.065189 master-0 kubenswrapper[4010]: E0319 09:17:20.065081 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3369dd066ed7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:13.003323095 +0000 UTC m=+12.529267702,LastTimestamp:2026-03-19 09:17:13.003323095 +0000 UTC m=+12.529267702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.067806 master-0 kubenswrapper[4010]: E0319 09:17:20.067687 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3369de07f845 kube-system 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:13.020201029 +0000 UTC m=+12.546145646,LastTimestamp:2026-03-19 09:17:13.020201029 +0000 UTC m=+12.546145646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.072020 master-0 kubenswrapper[4010]: E0319 09:17:20.071882 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3369de18c00b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:13.021300747 +0000 UTC m=+12.547245354,LastTimestamp:2026-03-19 09:17:13.021300747 +0000 UTC m=+12.547245354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.076524 master-0 kubenswrapper[4010]: E0319 09:17:20.076422 4010 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3369ed4d527d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:13.276404349 +0000 UTC m=+12.802348956,LastTimestamp:2026-03-19 09:17:13.276404349 +0000 UTC m=+12.802348956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.080857 master-0 kubenswrapper[4010]: E0319 09:17:20.080747 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336a5cc8448a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:15.146732682 +0000 UTC m=+14.672677329,LastTimestamp:2026-03-19 09:17:15.146732682 +0000 UTC 
m=+14.672677329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.085023 master-0 kubenswrapper[4010]: E0319 09:17:20.084884 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336a6d90f8ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:15.428321535 +0000 UTC m=+14.954266172,LastTimestamp:2026-03-19 09:17:15.428321535 +0000 UTC m=+14.954266172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.089581 master-0 kubenswrapper[4010]: E0319 09:17:20.089459 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336a6daaa40a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:15.430003722 +0000 UTC m=+14.955948369,LastTimestamp:2026-03-19 09:17:15.430003722 +0000 UTC m=+14.955948369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.094658 master-0 kubenswrapper[4010]: E0319 09:17:20.094527 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b3f5bba0c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\" in 5.926s (5.926s including waiting). 
Image size: 505246690 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:18.948047372 +0000 UTC m=+18.473991979,LastTimestamp:2026-03-19 09:17:18.948047372 +0000 UTC m=+18.473991979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.099543 master-0 kubenswrapper[4010]: E0319 09:17:20.099373 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336b40042647 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" in 3.529s (3.529s including waiting). 
Image size: 514984269 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:18.959085127 +0000 UTC m=+18.485029744,LastTimestamp:2026-03-19 09:17:18.959085127 +0000 UTC m=+18.485029744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.103906 master-0 kubenswrapper[4010]: E0319 09:17:20.103769 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b48adc641 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.104419393 +0000 UTC m=+18.630363990,LastTimestamp:2026-03-19 09:17:19.104419393 +0000 UTC m=+18.630363990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.111637 master-0 kubenswrapper[4010]: E0319 09:17:20.111519 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336b48c86e6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.106166383 +0000 UTC m=+18.632111000,LastTimestamp:2026-03-19 09:17:19.106166383 +0000 UTC m=+18.632111000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.117140 master-0 kubenswrapper[4010]: E0319 09:17:20.117040 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b493a26ca kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.113619146 +0000 UTC m=+18.639563753,LastTimestamp:2026-03-19 09:17:19.113619146 +0000 UTC m=+18.639563753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.124019 master-0 kubenswrapper[4010]: E0319 09:17:20.123905 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e336b495089ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.115086266 +0000 UTC m=+18.641030873,LastTimestamp:2026-03-19 09:17:19.115086266 +0000 UTC m=+18.641030873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.129089 master-0 kubenswrapper[4010]: E0319 09:17:20.128939 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3368134a1703\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3368134a1703 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.323792131 +0000 UTC m=+4.849736728,LastTimestamp:2026-03-19 09:17:19.23212005 +0000 UTC 
m=+18.758064657,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.133924 master-0 kubenswrapper[4010]: E0319 09:17:20.133822 4010 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e336b54957dc1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:19.304154561 +0000 UTC m=+18.830099168,LastTimestamp:2026-03-19 09:17:19.304154561 +0000 UTC m=+18.830099168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.134188 master-0 kubenswrapper[4010]: I0319 09:17:20.134154 4010 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:20.137354 master-0 kubenswrapper[4010]: E0319 09:17:20.137289 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33681df7de75\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33681df7de75 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.502953077 +0000 UTC m=+5.028897684,LastTimestamp:2026-03-19 09:17:19.421382551 +0000 UTC m=+18.947327188,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.141548 master-0 kubenswrapper[4010]: E0319 09:17:20.141425 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e33681eaa4120\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e33681eaa4120 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:05.514643744 +0000 UTC m=+5.040588351,LastTimestamp:2026-03-19 09:17:19.435302414 +0000 UTC m=+18.961247021,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.145295 master-0 kubenswrapper[4010]: E0319 09:17:20.145193 4010 event.go:359] "Server rejected event 
(will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189e3369dd066ed7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3369dd066ed7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:13.003323095 +0000 UTC m=+12.529267702,LastTimestamp:2026-03-19 09:17:19.494791916 +0000 UTC m=+19.020736543,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.150952 master-0 kubenswrapper[4010]: E0319 09:17:20.150822 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189e3369de07f845\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3369de07f845 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:13.020201029 +0000 UTC m=+12.546145646,LastTimestamp:2026-03-19 09:17:19.50902905 +0000 UTC 
m=+19.034973667,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.158049 master-0 kubenswrapper[4010]: I0319 09:17:20.157969 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:20.306357 master-0 kubenswrapper[4010]: I0319 09:17:20.306315 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"f432083e0bbefbf0b796c955a8b8a3248de20b6a5a5f87ee1ff2f03234e367ae"} Mar 19 09:17:20.306957 master-0 kubenswrapper[4010]: I0319 09:17:20.306936 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:20.307696 master-0 kubenswrapper[4010]: I0319 09:17:20.307676 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:20.307809 master-0 kubenswrapper[4010]: I0319 09:17:20.307795 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:20.307884 master-0 kubenswrapper[4010]: I0319 09:17:20.307858 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:17:20.307959 master-0 kubenswrapper[4010]: I0319 09:17:20.307944 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:20.308379 master-0 kubenswrapper[4010]: I0319 09:17:20.308355 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 19 
09:17:20.308698 master-0 kubenswrapper[4010]: I0319 09:17:20.308677 4010 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11" exitCode=1 Mar 19 09:17:20.308805 master-0 kubenswrapper[4010]: I0319 09:17:20.308747 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11"} Mar 19 09:17:20.308860 master-0 kubenswrapper[4010]: I0319 09:17:20.308825 4010 scope.go:117] "RemoveContainer" containerID="0e597853286cd578d7724724b018289225861b5b1a0d3344e56ed5aa003721a7" Mar 19 09:17:20.308949 master-0 kubenswrapper[4010]: I0319 09:17:20.308932 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:20.309026 master-0 kubenswrapper[4010]: I0319 09:17:20.309014 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:20.309558 master-0 kubenswrapper[4010]: I0319 09:17:20.309538 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:20.309620 master-0 kubenswrapper[4010]: I0319 09:17:20.309557 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:20.309620 master-0 kubenswrapper[4010]: I0319 09:17:20.309576 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:20.309620 master-0 kubenswrapper[4010]: I0319 09:17:20.309586 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:20.309776 master-0 kubenswrapper[4010]: I0319 09:17:20.309560 4010 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:20.309809 master-0 kubenswrapper[4010]: I0319 09:17:20.309778 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:20.310029 master-0 kubenswrapper[4010]: I0319 09:17:20.310011 4010 scope.go:117] "RemoveContainer" containerID="c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11" Mar 19 09:17:20.310150 master-0 kubenswrapper[4010]: E0319 09:17:20.310130 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 09:17:20.316932 master-0 kubenswrapper[4010]: E0319 09:17:20.316812 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e336886a1ec3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336886a1ec3b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 
09:17:07.258928187 +0000 UTC m=+6.784872794,LastTimestamp:2026-03-19 09:17:20.310107963 +0000 UTC m=+19.836052570,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:20.682018 master-0 kubenswrapper[4010]: E0319 09:17:20.681943 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:17:20.913772 master-0 kubenswrapper[4010]: I0319 09:17:20.913700 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:20.914756 master-0 kubenswrapper[4010]: I0319 09:17:20.914665 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:20.914756 master-0 kubenswrapper[4010]: I0319 09:17:20.914706 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:20.914756 master-0 kubenswrapper[4010]: I0319 09:17:20.914718 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:20.914756 master-0 kubenswrapper[4010]: I0319 09:17:20.914750 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:20.919604 master-0 kubenswrapper[4010]: E0319 09:17:20.919571 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 19 09:17:20.957741 master-0 kubenswrapper[4010]: I0319 09:17:20.957612 4010 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:20.962700 master-0 kubenswrapper[4010]: I0319 09:17:20.962658 4010 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:21.063038 master-0 kubenswrapper[4010]: I0319 09:17:21.062918 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:21.197889 master-0 kubenswrapper[4010]: E0319 09:17:21.197817 4010 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:17:21.312060 master-0 kubenswrapper[4010]: I0319 09:17:21.311953 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:17:21.312542 master-0 kubenswrapper[4010]: I0319 09:17:21.312518 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:21.312651 master-0 kubenswrapper[4010]: I0319 09:17:21.312524 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:21.312701 master-0 kubenswrapper[4010]: I0319 09:17:21.312678 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:21.313270 master-0 kubenswrapper[4010]: I0319 09:17:21.313249 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:21.313332 master-0 kubenswrapper[4010]: I0319 09:17:21.313275 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 19 09:17:21.313332 master-0 kubenswrapper[4010]: I0319 09:17:21.313285 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:21.313650 master-0 kubenswrapper[4010]: I0319 09:17:21.313602 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:21.313709 master-0 kubenswrapper[4010]: I0319 09:17:21.313674 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:21.313709 master-0 kubenswrapper[4010]: I0319 09:17:21.313686 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:21.317104 master-0 kubenswrapper[4010]: I0319 09:17:21.317079 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:17:21.708886 master-0 kubenswrapper[4010]: W0319 09:17:21.708826 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 09:17:21.708886 master-0 kubenswrapper[4010]: E0319 09:17:21.708874 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 09:17:22.064011 master-0 kubenswrapper[4010]: I0319 09:17:22.063808 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:22.246427 
master-0 kubenswrapper[4010]: I0319 09:17:22.246298 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:22.318600 master-0 kubenswrapper[4010]: I0319 09:17:22.318316 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:22.318600 master-0 kubenswrapper[4010]: I0319 09:17:22.318349 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:22.320624 master-0 kubenswrapper[4010]: I0319 09:17:22.320588 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:22.320624 master-0 kubenswrapper[4010]: I0319 09:17:22.320636 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:22.320785 master-0 kubenswrapper[4010]: I0319 09:17:22.320654 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:22.321341 master-0 kubenswrapper[4010]: I0319 09:17:22.321270 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:22.321438 master-0 kubenswrapper[4010]: I0319 09:17:22.321362 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:22.321438 master-0 kubenswrapper[4010]: I0319 09:17:22.321378 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:23.046846 master-0 kubenswrapper[4010]: I0319 09:17:23.046769 4010 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:23.052233 master-0 kubenswrapper[4010]: I0319 09:17:23.052187 4010 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:23.063978 master-0 kubenswrapper[4010]: I0319 09:17:23.063935 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:23.320378 master-0 kubenswrapper[4010]: I0319 09:17:23.320254 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:23.320378 master-0 kubenswrapper[4010]: I0319 09:17:23.320317 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:23.321418 master-0 kubenswrapper[4010]: I0319 09:17:23.321383 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:23.321561 master-0 kubenswrapper[4010]: I0319 09:17:23.321427 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:23.321561 master-0 kubenswrapper[4010]: I0319 09:17:23.321383 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:23.321561 master-0 kubenswrapper[4010]: I0319 09:17:23.321446 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:23.321561 master-0 kubenswrapper[4010]: I0319 09:17:23.321501 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:23.321561 master-0 kubenswrapper[4010]: I0319 09:17:23.321519 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:23.977538 master-0 kubenswrapper[4010]: I0319 09:17:23.977436 4010 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:23.985496 master-0 kubenswrapper[4010]: I0319 09:17:23.984056 4010 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:24.065804 master-0 kubenswrapper[4010]: I0319 09:17:24.065705 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:24.322825 master-0 kubenswrapper[4010]: I0319 09:17:24.322760 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:24.323819 master-0 kubenswrapper[4010]: I0319 09:17:24.323779 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:24.323884 master-0 kubenswrapper[4010]: I0319 09:17:24.323832 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:24.323884 master-0 kubenswrapper[4010]: I0319 09:17:24.323845 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:25.062604 master-0 kubenswrapper[4010]: I0319 09:17:25.062501 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:25.324891 master-0 kubenswrapper[4010]: I0319 09:17:25.324661 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:25.325763 master-0 kubenswrapper[4010]: I0319 09:17:25.325699 4010 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:25.325763 master-0 kubenswrapper[4010]: I0319 09:17:25.325761 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:25.325882 master-0 kubenswrapper[4010]: I0319 09:17:25.325772 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:26.067314 master-0 kubenswrapper[4010]: I0319 09:17:26.067258 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:27.066457 master-0 kubenswrapper[4010]: I0319 09:17:27.066326 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:27.710418 master-0 kubenswrapper[4010]: E0319 09:17:27.710282 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:17:27.920525 master-0 kubenswrapper[4010]: I0319 09:17:27.920439 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:27.921655 master-0 kubenswrapper[4010]: I0319 09:17:27.921601 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:27.921655 master-0 kubenswrapper[4010]: I0319 09:17:27.921634 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 19 09:17:27.921655 master-0 kubenswrapper[4010]: I0319 09:17:27.921644 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:27.921959 master-0 kubenswrapper[4010]: I0319 09:17:27.921690 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:27.930288 master-0 kubenswrapper[4010]: E0319 09:17:27.930192 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 19 09:17:28.068113 master-0 kubenswrapper[4010]: I0319 09:17:28.067904 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:29.065959 master-0 kubenswrapper[4010]: I0319 09:17:29.065898 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:30.065975 master-0 kubenswrapper[4010]: I0319 09:17:30.065850 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:30.164632 master-0 kubenswrapper[4010]: I0319 09:17:30.164565 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:30.164883 master-0 kubenswrapper[4010]: I0319 09:17:30.164736 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 19 09:17:30.165998 master-0 kubenswrapper[4010]: I0319 09:17:30.165935 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:30.165998 master-0 kubenswrapper[4010]: I0319 09:17:30.165980 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:30.165998 master-0 kubenswrapper[4010]: I0319 09:17:30.166000 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:30.170516 master-0 kubenswrapper[4010]: I0319 09:17:30.170396 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:17:30.336074 master-0 kubenswrapper[4010]: I0319 09:17:30.335884 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:30.337016 master-0 kubenswrapper[4010]: I0319 09:17:30.336936 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:30.337016 master-0 kubenswrapper[4010]: I0319 09:17:30.337001 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:30.337016 master-0 kubenswrapper[4010]: I0319 09:17:30.337021 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:31.064596 master-0 kubenswrapper[4010]: I0319 09:17:31.064519 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:31.198399 master-0 kubenswrapper[4010]: E0319 09:17:31.198327 4010 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:17:32.062345 master-0 kubenswrapper[4010]: I0319 09:17:32.062235 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:33.066746 master-0 kubenswrapper[4010]: I0319 09:17:33.066593 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:33.123511 master-0 kubenswrapper[4010]: W0319 09:17:33.123392 4010 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 09:17:33.123511 master-0 kubenswrapper[4010]: E0319 09:17:33.123458 4010 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 09:17:33.227073 master-0 kubenswrapper[4010]: I0319 09:17:33.226952 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:33.228678 master-0 kubenswrapper[4010]: I0319 09:17:33.228586 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:33.228678 master-0 kubenswrapper[4010]: I0319 09:17:33.228674 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 19 09:17:33.228827 master-0 kubenswrapper[4010]: I0319 09:17:33.228696 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:17:33.229312 master-0 kubenswrapper[4010]: I0319 09:17:33.229265 4010 scope.go:117] "RemoveContainer" containerID="c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11" Mar 19 09:17:33.229601 master-0 kubenswrapper[4010]: E0319 09:17:33.229546 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 09:17:33.235248 master-0 kubenswrapper[4010]: E0319 09:17:33.235064 4010 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e336886a1ec3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e336886a1ec3b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:17:07.258928187 +0000 UTC m=+6.784872794,LastTimestamp:2026-03-19 
09:17:33.229460978 +0000 UTC m=+32.755405625,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:17:33.477752 master-0 kubenswrapper[4010]: I0319 09:17:33.477652 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:17:33.495795 master-0 kubenswrapper[4010]: I0319 09:17:33.495706 4010 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 09:17:34.063905 master-0 kubenswrapper[4010]: I0319 09:17:34.063852 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:34.716865 master-0 kubenswrapper[4010]: E0319 09:17:34.716757 4010 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:17:34.931374 master-0 kubenswrapper[4010]: I0319 09:17:34.931278 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:17:34.932951 master-0 kubenswrapper[4010]: I0319 09:17:34.932881 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:17:34.932951 master-0 kubenswrapper[4010]: I0319 09:17:34.932941 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:17:34.932951 master-0 kubenswrapper[4010]: I0319 09:17:34.932956 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 
09:17:34.933144 master-0 kubenswrapper[4010]: I0319 09:17:34.933077 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:17:34.941287 master-0 kubenswrapper[4010]: E0319 09:17:34.941228 4010 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 19 09:17:35.063834 master-0 kubenswrapper[4010]: I0319 09:17:35.063646 4010 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:17:35.143933 master-0 kubenswrapper[4010]: I0319 09:17:35.143857 4010 csr.go:261] certificate signing request csr-vdsgb is approved, waiting to be issued Mar 19 09:17:35.600189 master-0 kubenswrapper[4010]: I0319 09:17:35.600073 4010 csr.go:257] certificate signing request csr-vdsgb is issued Mar 19 09:17:35.943224 master-0 kubenswrapper[4010]: I0319 09:17:35.943086 4010 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 09:17:36.067603 master-0 kubenswrapper[4010]: I0319 09:17:36.067549 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.086382 master-0 kubenswrapper[4010]: I0319 09:17:36.086331 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.148595 master-0 kubenswrapper[4010]: I0319 09:17:36.148530 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.419745 master-0 kubenswrapper[4010]: I0319 09:17:36.419531 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.419745 master-0 kubenswrapper[4010]: E0319 09:17:36.419599 4010 csi_plugin.go:305] Failed to 
initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 19 09:17:36.441665 master-0 kubenswrapper[4010]: I0319 09:17:36.441592 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.459166 master-0 kubenswrapper[4010]: I0319 09:17:36.459098 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.516019 master-0 kubenswrapper[4010]: I0319 09:17:36.515935 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.601889 master-0 kubenswrapper[4010]: I0319 09:17:36.601756 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 04:34:02.670068267 +0000 UTC Mar 19 09:17:36.601889 master-0 kubenswrapper[4010]: I0319 09:17:36.601845 4010 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 19h16m26.06822852s for next certificate rotation Mar 19 09:17:36.777460 master-0 kubenswrapper[4010]: I0319 09:17:36.777396 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.777460 master-0 kubenswrapper[4010]: E0319 09:17:36.777454 4010 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 19 09:17:36.880720 master-0 kubenswrapper[4010]: I0319 09:17:36.880650 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.898094 master-0 kubenswrapper[4010]: I0319 09:17:36.898000 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:36.957462 master-0 kubenswrapper[4010]: I0319 09:17:36.957399 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 19 09:17:37.219805 master-0 
kubenswrapper[4010]: I0319 09:17:37.219725 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 09:17:37.219805 master-0 kubenswrapper[4010]: E0319 09:17:37.219774 4010 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 09:17:37.549949 master-0 kubenswrapper[4010]: I0319 09:17:37.549786 4010 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 09:17:37.788606 master-0 kubenswrapper[4010]: I0319 09:17:37.788556 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 09:17:37.805426 master-0 kubenswrapper[4010]: I0319 09:17:37.805303 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 09:17:37.864896 master-0 kubenswrapper[4010]: I0319 09:17:37.864759 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 09:17:38.062078 master-0 kubenswrapper[4010]: I0319 09:17:38.061871 4010 apiserver.go:52] "Watching apiserver"
Mar 19 09:17:38.066032 master-0 kubenswrapper[4010]: I0319 09:17:38.065963 4010 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 09:17:38.066255 master-0 kubenswrapper[4010]: I0319 09:17:38.066160 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=[]
Mar 19 09:17:38.121485 master-0 kubenswrapper[4010]: I0319 09:17:38.121371 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 09:17:38.121485 master-0 kubenswrapper[4010]: E0319 09:17:38.121430 4010 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 09:17:38.166913 master-0 kubenswrapper[4010]: I0319 09:17:38.166831 4010 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 09:17:41.199153 master-0 kubenswrapper[4010]: E0319 09:17:41.199051 4010 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 09:17:41.862652 master-0 kubenswrapper[4010]: E0319 09:17:41.862451 4010 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 19 09:17:41.942509 master-0 kubenswrapper[4010]: I0319 09:17:41.942399 4010 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:17:41.944082 master-0 kubenswrapper[4010]: I0319 09:17:41.944031 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:17:41.944168 master-0 kubenswrapper[4010]: I0319 09:17:41.944109 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:17:41.944168 master-0 kubenswrapper[4010]: I0319 09:17:41.944135 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:17:41.944439 master-0 kubenswrapper[4010]: I0319 09:17:41.944248 4010 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:17:42.058013 master-0 kubenswrapper[4010]: I0319 09:17:42.057906 4010 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 09:17:42.387738 master-0 kubenswrapper[4010]: I0319 09:17:42.387649 4010 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 19 09:17:42.387738 master-0 kubenswrapper[4010]: E0319 09:17:42.387725 4010 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 19 09:17:42.986371 master-0 kubenswrapper[4010]: I0319 09:17:42.986298 4010 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 09:17:43.090271 master-0 kubenswrapper[4010]: I0319 09:17:43.090149 4010 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 19 09:17:43.130003 master-0 kubenswrapper[4010]: I0319 09:17:43.129933 4010 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 19 09:17:43.270539 master-0 kubenswrapper[4010]: I0319 09:17:43.270341 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-gn85g"]
Mar 19 09:17:43.270681 master-0 kubenswrapper[4010]: I0319 09:17:43.270559 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.272213 master-0 kubenswrapper[4010]: I0319 09:17:43.272180 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Mar 19 09:17:43.272390 master-0 kubenswrapper[4010]: I0319 09:17:43.272365 4010 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 19 09:17:43.272835 master-0 kubenswrapper[4010]: I0319 09:17:43.272805 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 19 09:17:43.273047 master-0 kubenswrapper[4010]: I0319 09:17:43.272991 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 19 09:17:43.355172 master-0 kubenswrapper[4010]: I0319 09:17:43.355081 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-var-run-resolv-conf\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.355172 master-0 kubenswrapper[4010]: I0319 09:17:43.355163 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-ca-bundle\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.355396 master-0 kubenswrapper[4010]: I0319 09:17:43.355211 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twvgm\" (UniqueName: \"kubernetes.io/projected/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-kube-api-access-twvgm\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.355396 master-0 kubenswrapper[4010]: I0319 09:17:43.355252 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-resolv-conf\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.355396 master-0 kubenswrapper[4010]: I0319 09:17:43.355281 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-sno-bootstrap-files\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.456168 master-0 kubenswrapper[4010]: I0319 09:17:43.456056 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-resolv-conf\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.456168 master-0 kubenswrapper[4010]: I0319 09:17:43.456115 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-sno-bootstrap-files\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.456168 master-0 kubenswrapper[4010]: I0319 09:17:43.456146 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-var-run-resolv-conf\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.457100 master-0 kubenswrapper[4010]: I0319 09:17:43.456201 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-resolv-conf\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.457100 master-0 kubenswrapper[4010]: I0319 09:17:43.456248 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-ca-bundle\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.457100 master-0 kubenswrapper[4010]: I0319 09:17:43.456267 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twvgm\" (UniqueName: \"kubernetes.io/projected/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-kube-api-access-twvgm\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.457100 master-0 kubenswrapper[4010]: I0319 09:17:43.456438 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-var-run-resolv-conf\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.457100 master-0 kubenswrapper[4010]: I0319 09:17:43.456492 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-sno-bootstrap-files\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.457100 master-0 kubenswrapper[4010]: I0319 09:17:43.456671 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-ca-bundle\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.481148 master-0 kubenswrapper[4010]: I0319 09:17:43.481080 4010 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 09:17:43.488111 master-0 kubenswrapper[4010]: I0319 09:17:43.488061 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twvgm\" (UniqueName: \"kubernetes.io/projected/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-kube-api-access-twvgm\") pod \"assisted-installer-controller-gn85g\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") " pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.595073 master-0 kubenswrapper[4010]: I0319 09:17:43.594914 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:43.833627 master-0 kubenswrapper[4010]: I0319 09:17:43.833551 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-7bd846bfc4-jxvxl"]
Mar 19 09:17:43.834047 master-0 kubenswrapper[4010]: I0319 09:17:43.833997 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:43.837091 master-0 kubenswrapper[4010]: I0319 09:17:43.837044 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 09:17:43.837973 master-0 kubenswrapper[4010]: I0319 09:17:43.837930 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 09:17:43.838359 master-0 kubenswrapper[4010]: I0319 09:17:43.838309 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 09:17:43.959715 master-0 kubenswrapper[4010]: I0319 09:17:43.959614 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:43.959715 master-0 kubenswrapper[4010]: I0319 09:17:43.959675 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:43.959715 master-0 kubenswrapper[4010]: I0319 09:17:43.959708 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.060310 master-0 kubenswrapper[4010]: I0319 09:17:44.060221 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.060310 master-0 kubenswrapper[4010]: I0319 09:17:44.060315 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.060636 master-0 kubenswrapper[4010]: I0319 09:17:44.060351 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.061003 master-0 kubenswrapper[4010]: I0319 09:17:44.060878 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.065326 master-0 kubenswrapper[4010]: I0319 09:17:44.065274 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.084498 master-0 kubenswrapper[4010]: I0319 09:17:44.084438 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.152059 master-0 kubenswrapper[4010]: I0319 09:17:44.151891 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:17:44.166937 master-0 kubenswrapper[4010]: W0319 09:17:44.166881 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a6be5d9_c0d3_49c3_bb9a_4c8bec66b823.slice/crio-53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204 WatchSource:0}: Error finding container 53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204: Status 404 returned error can't find the container with id 53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204
Mar 19 09:17:44.230905 master-0 kubenswrapper[4010]: I0319 09:17:44.230259 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"]
Mar 19 09:17:44.231306 master-0 kubenswrapper[4010]: I0319 09:17:44.231282 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.235748 master-0 kubenswrapper[4010]: I0319 09:17:44.235224 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 09:17:44.235748 master-0 kubenswrapper[4010]: I0319 09:17:44.235278 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:17:44.237374 master-0 kubenswrapper[4010]: I0319 09:17:44.237335 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:17:44.363420 master-0 kubenswrapper[4010]: I0319 09:17:44.363353 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.363420 master-0 kubenswrapper[4010]: I0319 09:17:44.363415 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.363685 master-0 kubenswrapper[4010]: I0319 09:17:44.363550 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.363685 master-0 kubenswrapper[4010]: I0319 09:17:44.363650 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.363685 master-0 kubenswrapper[4010]: I0319 09:17:44.363676 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.374284 master-0 kubenswrapper[4010]: I0319 09:17:44.374221 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" event={"ID":"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823","Type":"ContainerStarted","Data":"53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204"}
Mar 19 09:17:44.375729 master-0 kubenswrapper[4010]: I0319 09:17:44.375691 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-gn85g" event={"ID":"9039b9d3-27c2-4c42-ae8b-28e40570b3c2","Type":"ContainerStarted","Data":"9589bbab032e262b4d7aedeb656ab180a0c26f2d3e71118ea25c48ac0d07f6bd"}
Mar 19 09:17:44.464301 master-0 kubenswrapper[4010]: I0319 09:17:44.464185 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.464301 master-0 kubenswrapper[4010]: I0319 09:17:44.464292 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.464978 master-0 kubenswrapper[4010]: I0319 09:17:44.464338 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.464978 master-0 kubenswrapper[4010]: I0319 09:17:44.464374 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.464978 master-0 kubenswrapper[4010]: I0319 09:17:44.464410 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.464978 master-0 kubenswrapper[4010]: E0319 09:17:44.464627 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:44.464978 master-0 kubenswrapper[4010]: E0319 09:17:44.464743 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:17:44.964702769 +0000 UTC m=+44.490647416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:44.464978 master-0 kubenswrapper[4010]: I0319 09:17:44.464737 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.465246 master-0 kubenswrapper[4010]: I0319 09:17:44.465048 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.466740 master-0 kubenswrapper[4010]: I0319 09:17:44.466665 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.486653 master-0 kubenswrapper[4010]: I0319 09:17:44.486601 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.968278 master-0 kubenswrapper[4010]: I0319 09:17:44.968228 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:44.968634 master-0 kubenswrapper[4010]: E0319 09:17:44.968571 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:44.968750 master-0 kubenswrapper[4010]: E0319 09:17:44.968728 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:17:45.968693507 +0000 UTC m=+45.494638114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:45.396738 master-0 kubenswrapper[4010]: I0319 09:17:45.396502 4010 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 09:17:45.717777 master-0 kubenswrapper[4010]: I0319 09:17:45.717712 4010 csr.go:261] certificate signing request csr-tjw7q is approved, waiting to be issued
Mar 19 09:17:45.863285 master-0 kubenswrapper[4010]: I0319 09:17:45.863218 4010 csr.go:257] certificate signing request csr-tjw7q is issued
Mar 19 09:17:45.975657 master-0 kubenswrapper[4010]: I0319 09:17:45.975458 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:45.975824 master-0 kubenswrapper[4010]: E0319 09:17:45.975678 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:45.975824 master-0 kubenswrapper[4010]: E0319 09:17:45.975762 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:17:47.975740591 +0000 UTC m=+47.501685208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:46.325020 master-0 kubenswrapper[4010]: I0319 09:17:46.324848 4010 scope.go:117] "RemoveContainer" containerID="c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11"
Mar 19 09:17:46.325276 master-0 kubenswrapper[4010]: I0319 09:17:46.325209 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 19 09:17:46.864493 master-0 kubenswrapper[4010]: I0319 09:17:46.864427 4010 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 03:21:47.002679501 +0000 UTC
Mar 19 09:17:46.864493 master-0 kubenswrapper[4010]: I0319 09:17:46.864490 4010 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h4m0.138192637s for next certificate rotation
Mar 19 09:17:47.384995 master-0 kubenswrapper[4010]: I0319 09:17:47.384957 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 09:17:47.385339 master-0 kubenswrapper[4010]: I0319 09:17:47.385315 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"e046e1ab5ed34b841248a951c60543dfca2a668c2cdbbcdc17996eec0b9a0bfb"}
Mar 19 09:17:47.865707 master-0 kubenswrapper[4010]: I0319 09:17:47.865628 4010 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 04:56:12.088454479 +0000 UTC
Mar 19 09:17:47.865707 master-0 kubenswrapper[4010]: I0319 09:17:47.865680 4010 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h38m24.222777858s for next certificate rotation
Mar 19 09:17:47.986519 master-0 kubenswrapper[4010]: I0319 09:17:47.986076 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:17:47.986519 master-0 kubenswrapper[4010]: E0319 09:17:47.986182 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:47.986519 master-0 kubenswrapper[4010]: E0319 09:17:47.986231 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:17:51.986217794 +0000 UTC m=+51.512162391 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:17:50.392487 master-0 kubenswrapper[4010]: I0319 09:17:50.392412 4010 generic.go:334] "Generic (PLEG): container finished" podID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerID="ddf97e1b992b687ae1658f8b5cc4c1c01ae45509b7aaa2768e80614c358636c9" exitCode=0
Mar 19 09:17:50.392942 master-0 kubenswrapper[4010]: I0319 09:17:50.392487 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-gn85g" event={"ID":"9039b9d3-27c2-4c42-ae8b-28e40570b3c2","Type":"ContainerDied","Data":"ddf97e1b992b687ae1658f8b5cc4c1c01ae45509b7aaa2768e80614c358636c9"}
Mar 19 09:17:50.394167 master-0 kubenswrapper[4010]: I0319 09:17:50.394114 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" event={"ID":"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823","Type":"ContainerStarted","Data":"4903db04251051a54ad7e347003826304ccc0327af5e8e5393199af2a3df5cfe"}
Mar 19 09:17:50.408980 master-0 kubenswrapper[4010]: I0319 09:17:50.408906 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=4.408891068 podStartE2EDuration="4.408891068s" podCreationTimestamp="2026-03-19 09:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:17:47.398846376 +0000 UTC m=+46.924790983" watchObservedRunningTime="2026-03-19 09:17:50.408891068 +0000 UTC m=+49.934835675"
Mar 19 09:17:51.513496 master-0 kubenswrapper[4010]: I0319 09:17:51.513431 4010 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:17:51.524357 master-0 kubenswrapper[4010]: I0319 09:17:51.524291 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" podStartSLOduration=3.151799329 podStartE2EDuration="8.524270879s" podCreationTimestamp="2026-03-19 09:17:43 +0000 UTC" firstStartedPulling="2026-03-19 09:17:44.171019834 +0000 UTC m=+43.696964441" lastFinishedPulling="2026-03-19 09:17:49.543491384 +0000 UTC m=+49.069435991" observedRunningTime="2026-03-19 09:17:50.42469132 +0000 UTC m=+49.950635927" watchObservedRunningTime="2026-03-19 09:17:51.524270879 +0000 UTC m=+51.050215486"
Mar 19 09:17:51.612353 master-0 kubenswrapper[4010]: I0319 09:17:51.612307 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-var-run-resolv-conf\") pod \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") "
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612361 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twvgm\" (UniqueName: \"kubernetes.io/projected/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-kube-api-access-twvgm\") pod \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") "
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612393 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-ca-bundle\") pod \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") "
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612415 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-sno-bootstrap-files\") pod \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") "
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612438 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-resolv-conf\") pod \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\" (UID: \"9039b9d3-27c2-4c42-ae8b-28e40570b3c2\") "
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612449 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "9039b9d3-27c2-4c42-ae8b-28e40570b3c2" (UID: "9039b9d3-27c2-4c42-ae8b-28e40570b3c2"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612499 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "9039b9d3-27c2-4c42-ae8b-28e40570b3c2" (UID: "9039b9d3-27c2-4c42-ae8b-28e40570b3c2"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612549 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "9039b9d3-27c2-4c42-ae8b-28e40570b3c2" (UID: "9039b9d3-27c2-4c42-ae8b-28e40570b3c2"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:17:51.612569 master-0 kubenswrapper[4010]: I0319 09:17:51.612553 4010 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\""
Mar 19 09:17:51.612797 master-0 kubenswrapper[4010]: I0319 09:17:51.612594 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "9039b9d3-27c2-4c42-ae8b-28e40570b3c2" (UID: "9039b9d3-27c2-4c42-ae8b-28e40570b3c2"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:17:51.612797 master-0 kubenswrapper[4010]: I0319 09:17:51.612598 4010 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:17:51.616298 master-0 kubenswrapper[4010]: I0319 09:17:51.616247 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-kube-api-access-twvgm" (OuterVolumeSpecName: "kube-api-access-twvgm") pod "9039b9d3-27c2-4c42-ae8b-28e40570b3c2" (UID: "9039b9d3-27c2-4c42-ae8b-28e40570b3c2"). InnerVolumeSpecName "kube-api-access-twvgm".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:17:51.713696 master-0 kubenswrapper[4010]: I0319 09:17:51.713638 4010 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 19 09:17:51.713696 master-0 kubenswrapper[4010]: I0319 09:17:51.713675 4010 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:17:51.713696 master-0 kubenswrapper[4010]: I0319 09:17:51.713689 4010 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twvgm\" (UniqueName: \"kubernetes.io/projected/9039b9d3-27c2-4c42-ae8b-28e40570b3c2-kube-api-access-twvgm\") on node \"master-0\" DevicePath \"\"" Mar 19 09:17:52.016198 master-0 kubenswrapper[4010]: I0319 09:17:52.016117 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:17:52.016422 master-0 kubenswrapper[4010]: E0319 09:17:52.016257 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:17:52.016422 master-0 kubenswrapper[4010]: E0319 09:17:52.016321 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:00.016303487 +0000 UTC m=+59.542248094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:17:52.227331 master-0 kubenswrapper[4010]: I0319 09:17:52.227287 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-cnb74"] Mar 19 09:17:52.227581 master-0 kubenswrapper[4010]: E0319 09:17:52.227370 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller" Mar 19 09:17:52.227581 master-0 kubenswrapper[4010]: I0319 09:17:52.227388 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller" Mar 19 09:17:52.227581 master-0 kubenswrapper[4010]: I0319 09:17:52.227418 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller" Mar 19 09:17:52.227711 master-0 kubenswrapper[4010]: I0319 09:17:52.227626 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:52.318030 master-0 kubenswrapper[4010]: I0319 09:17:52.317911 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbmzg\" (UniqueName: \"kubernetes.io/projected/c252745a-f6dc-4e94-a4b2-fbf21c9602ee-kube-api-access-vbmzg\") pod \"mtu-prober-cnb74\" (UID: \"c252745a-f6dc-4e94-a4b2-fbf21c9602ee\") " pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:52.419131 master-0 kubenswrapper[4010]: I0319 09:17:52.419065 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbmzg\" (UniqueName: \"kubernetes.io/projected/c252745a-f6dc-4e94-a4b2-fbf21c9602ee-kube-api-access-vbmzg\") pod \"mtu-prober-cnb74\" (UID: \"c252745a-f6dc-4e94-a4b2-fbf21c9602ee\") " pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:52.432934 master-0 kubenswrapper[4010]: I0319 09:17:52.432878 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbmzg\" (UniqueName: \"kubernetes.io/projected/c252745a-f6dc-4e94-a4b2-fbf21c9602ee-kube-api-access-vbmzg\") pod \"mtu-prober-cnb74\" (UID: \"c252745a-f6dc-4e94-a4b2-fbf21c9602ee\") " pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:52.500968 master-0 kubenswrapper[4010]: I0319 09:17:52.500708 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-gn85g" event={"ID":"9039b9d3-27c2-4c42-ae8b-28e40570b3c2","Type":"ContainerDied","Data":"9589bbab032e262b4d7aedeb656ab180a0c26f2d3e71118ea25c48ac0d07f6bd"} Mar 19 09:17:52.501186 master-0 kubenswrapper[4010]: I0319 09:17:52.501171 4010 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9589bbab032e262b4d7aedeb656ab180a0c26f2d3e71118ea25c48ac0d07f6bd" Mar 19 09:17:52.501274 master-0 kubenswrapper[4010]: I0319 09:17:52.500852 4010 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-gn85g" Mar 19 09:17:52.537433 master-0 kubenswrapper[4010]: I0319 09:17:52.537312 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:52.551110 master-0 kubenswrapper[4010]: W0319 09:17:52.551047 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc252745a_f6dc_4e94_a4b2_fbf21c9602ee.slice/crio-180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26 WatchSource:0}: Error finding container 180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26: Status 404 returned error can't find the container with id 180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26 Mar 19 09:17:53.504745 master-0 kubenswrapper[4010]: I0319 09:17:53.504692 4010 generic.go:334] "Generic (PLEG): container finished" podID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerID="9b28c300e3439abe307f50e88ba8ce2d925b14966bafd61f93ba6a56066cd1f7" exitCode=0 Mar 19 09:17:53.504942 master-0 kubenswrapper[4010]: I0319 09:17:53.504784 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-cnb74" event={"ID":"c252745a-f6dc-4e94-a4b2-fbf21c9602ee","Type":"ContainerDied","Data":"9b28c300e3439abe307f50e88ba8ce2d925b14966bafd61f93ba6a56066cd1f7"} Mar 19 09:17:53.504942 master-0 kubenswrapper[4010]: I0319 09:17:53.504840 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-cnb74" event={"ID":"c252745a-f6dc-4e94-a4b2-fbf21c9602ee","Type":"ContainerStarted","Data":"180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26"} Mar 19 09:17:54.518159 master-0 kubenswrapper[4010]: I0319 09:17:54.518124 4010 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:54.634081 master-0 kubenswrapper[4010]: I0319 09:17:54.633963 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbmzg\" (UniqueName: \"kubernetes.io/projected/c252745a-f6dc-4e94-a4b2-fbf21c9602ee-kube-api-access-vbmzg\") pod \"c252745a-f6dc-4e94-a4b2-fbf21c9602ee\" (UID: \"c252745a-f6dc-4e94-a4b2-fbf21c9602ee\") " Mar 19 09:17:54.637018 master-0 kubenswrapper[4010]: I0319 09:17:54.636971 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c252745a-f6dc-4e94-a4b2-fbf21c9602ee-kube-api-access-vbmzg" (OuterVolumeSpecName: "kube-api-access-vbmzg") pod "c252745a-f6dc-4e94-a4b2-fbf21c9602ee" (UID: "c252745a-f6dc-4e94-a4b2-fbf21c9602ee"). InnerVolumeSpecName "kube-api-access-vbmzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:17:54.734999 master-0 kubenswrapper[4010]: I0319 09:17:54.734949 4010 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbmzg\" (UniqueName: \"kubernetes.io/projected/c252745a-f6dc-4e94-a4b2-fbf21c9602ee-kube-api-access-vbmzg\") on node \"master-0\" DevicePath \"\"" Mar 19 09:17:55.511967 master-0 kubenswrapper[4010]: I0319 09:17:55.511851 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-cnb74" event={"ID":"c252745a-f6dc-4e94-a4b2-fbf21c9602ee","Type":"ContainerDied","Data":"180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26"} Mar 19 09:17:55.511967 master-0 kubenswrapper[4010]: I0319 09:17:55.511900 4010 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26" Mar 19 09:17:55.511967 master-0 kubenswrapper[4010]: I0319 09:17:55.511919 4010 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-cnb74" Mar 19 09:17:57.139364 master-0 kubenswrapper[4010]: I0319 09:17:57.139057 4010 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-cnb74"] Mar 19 09:17:57.143196 master-0 kubenswrapper[4010]: I0319 09:17:57.143158 4010 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-cnb74"] Mar 19 09:17:57.231064 master-0 kubenswrapper[4010]: I0319 09:17:57.230995 4010 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" path="/var/lib/kubelet/pods/c252745a-f6dc-4e94-a4b2-fbf21c9602ee/volumes" Mar 19 09:18:00.074225 master-0 kubenswrapper[4010]: I0319 09:18:00.074160 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:18:00.075052 master-0 kubenswrapper[4010]: E0319 09:18:00.074337 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:00.075052 master-0 kubenswrapper[4010]: E0319 09:18:00.074413 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:16.074391952 +0000 UTC m=+75.600336569 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:18:02.028967 master-0 kubenswrapper[4010]: I0319 09:18:02.028666 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8svct"] Mar 19 09:18:02.028967 master-0 kubenswrapper[4010]: E0319 09:18:02.028778 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober" Mar 19 09:18:02.028967 master-0 kubenswrapper[4010]: I0319 09:18:02.028799 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober" Mar 19 09:18:02.028967 master-0 kubenswrapper[4010]: I0319 09:18:02.028832 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober" Mar 19 09:18:02.029838 master-0 kubenswrapper[4010]: I0319 09:18:02.029055 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8svct" Mar 19 09:18:02.034044 master-0 kubenswrapper[4010]: I0319 09:18:02.032859 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:18:02.034044 master-0 kubenswrapper[4010]: I0319 09:18:02.033200 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:18:02.034044 master-0 kubenswrapper[4010]: I0319 09:18:02.033496 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:18:02.034296 master-0 kubenswrapper[4010]: I0319 09:18:02.034233 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:18:02.087205 master-0 kubenswrapper[4010]: I0319 09:18:02.087121 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087205 master-0 kubenswrapper[4010]: I0319 09:18:02.087189 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087205 master-0 kubenswrapper[4010]: I0319 09:18:02.087224 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " 
pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087260 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087291 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087324 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087368 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087403 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " 
pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087438 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087499 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087534 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087566 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087594 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: 
\"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087626 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087656 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.087727 master-0 kubenswrapper[4010]: I0319 09:18:02.087710 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.088365 master-0 kubenswrapper[4010]: I0319 09:18:02.087751 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189151 master-0 kubenswrapper[4010]: I0319 09:18:02.189038 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: 
\"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189151 master-0 kubenswrapper[4010]: I0319 09:18:02.189131 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189568 master-0 kubenswrapper[4010]: I0319 09:18:02.189255 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189568 master-0 kubenswrapper[4010]: I0319 09:18:02.189374 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189568 master-0 kubenswrapper[4010]: I0319 09:18:02.189542 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189686 master-0 kubenswrapper[4010]: I0319 09:18:02.189663 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189730 master-0 kubenswrapper[4010]: 
I0319 09:18:02.189714 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189779 master-0 kubenswrapper[4010]: I0319 09:18:02.189749 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189823 master-0 kubenswrapper[4010]: I0319 09:18:02.189794 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189868 master-0 kubenswrapper[4010]: I0319 09:18:02.189826 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189912 master-0 kubenswrapper[4010]: I0319 09:18:02.189867 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.189955 master-0 kubenswrapper[4010]: I0319 09:18:02.189914 4010 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.190002 master-0 kubenswrapper[4010]: I0319 09:18:02.189954 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.190045 master-0 kubenswrapper[4010]: I0319 09:18:02.189990 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.190045 master-0 kubenswrapper[4010]: I0319 09:18:02.190015 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.190045 master-0 kubenswrapper[4010]: I0319 09:18:02.190000 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:18:02.190174 master-0 kubenswrapper[4010]: I0319 09:18:02.190072 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190174 master-0 kubenswrapper[4010]: I0319 09:18:02.190026 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190174 master-0 kubenswrapper[4010]: I0319 09:18:02.190035 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190174 master-0 kubenswrapper[4010]: I0319 09:18:02.190148 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190326 master-0 kubenswrapper[4010]: I0319 09:18:02.190206 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190326 master-0 kubenswrapper[4010]: I0319 09:18:02.190245 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190326 master-0 kubenswrapper[4010]: I0319 09:18:02.190275 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190494 master-0 kubenswrapper[4010]: I0319 09:18:02.190351 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190494 master-0 kubenswrapper[4010]: I0319 09:18:02.190373 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190494 master-0 kubenswrapper[4010]: I0319 09:18:02.190413 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190620 master-0 kubenswrapper[4010]: I0319 09:18:02.190527 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190620 master-0 kubenswrapper[4010]: I0319 09:18:02.190534 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190620 master-0 kubenswrapper[4010]: I0319 09:18:02.190563 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190620 master-0 kubenswrapper[4010]: I0319 09:18:02.190596 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.190795 master-0 kubenswrapper[4010]: I0319 09:18:02.190669 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.191210 master-0 kubenswrapper[4010]: I0319 09:18:02.191149 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.191771 master-0 kubenswrapper[4010]: I0319 09:18:02.191713 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.219454 master-0 kubenswrapper[4010]: I0319 09:18:02.211167 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.222239 master-0 kubenswrapper[4010]: I0319 09:18:02.222166 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tjzdb"]
Mar 19 09:18:02.222813 master-0 kubenswrapper[4010]: I0319 09:18:02.222780 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.225215 master-0 kubenswrapper[4010]: I0319 09:18:02.225177 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 09:18:02.225215 master-0 kubenswrapper[4010]: I0319 09:18:02.225203 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 09:18:02.291727 master-0 kubenswrapper[4010]: I0319 09:18:02.291598 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.291727 master-0 kubenswrapper[4010]: I0319 09:18:02.291647 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.291727 master-0 kubenswrapper[4010]: I0319 09:18:02.291675 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.291727 master-0 kubenswrapper[4010]: I0319 09:18:02.291699 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.291727 master-0 kubenswrapper[4010]: I0319 09:18:02.291722 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.292049 master-0 kubenswrapper[4010]: I0319 09:18:02.291793 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.292049 master-0 kubenswrapper[4010]: I0319 09:18:02.291872 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.292049 master-0 kubenswrapper[4010]: I0319 09:18:02.291896 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.345713 master-0 kubenswrapper[4010]: I0319 09:18:02.345631 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8svct"
Mar 19 09:18:02.356792 master-0 kubenswrapper[4010]: W0319 09:18:02.356737 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872e5f8c_b014_4283_a4d2_0e2cfd29e192.slice/crio-98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687 WatchSource:0}: Error finding container 98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687: Status 404 returned error can't find the container with id 98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687
Mar 19 09:18:02.393251 master-0 kubenswrapper[4010]: I0319 09:18:02.393147 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.393251 master-0 kubenswrapper[4010]: I0319 09:18:02.393196 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.393251 master-0 kubenswrapper[4010]: I0319 09:18:02.393242 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.393887 master-0 kubenswrapper[4010]: I0319 09:18:02.393426 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.393887 master-0 kubenswrapper[4010]: I0319 09:18:02.393734 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.393887 master-0 kubenswrapper[4010]: I0319 09:18:02.393643 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394303 master-0 kubenswrapper[4010]: I0319 09:18:02.394222 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394590 master-0 kubenswrapper[4010]: I0319 09:18:02.394507 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394697 master-0 kubenswrapper[4010]: I0319 09:18:02.394591 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394697 master-0 kubenswrapper[4010]: I0319 09:18:02.394640 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394697 master-0 kubenswrapper[4010]: I0319 09:18:02.394646 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394898 master-0 kubenswrapper[4010]: I0319 09:18:02.394703 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394898 master-0 kubenswrapper[4010]: I0319 09:18:02.394808 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.394898 master-0 kubenswrapper[4010]: I0319 09:18:02.394848 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.395790 master-0 kubenswrapper[4010]: I0319 09:18:02.395746 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.417357 master-0 kubenswrapper[4010]: I0319 09:18:02.417301 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.530124 master-0 kubenswrapper[4010]: I0319 09:18:02.529820 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svct" event={"ID":"872e5f8c-b014-4283-a4d2-0e2cfd29e192","Type":"ContainerStarted","Data":"98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687"}
Mar 19 09:18:02.542956 master-0 kubenswrapper[4010]: I0319 09:18:02.542827 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:18:02.555014 master-0 kubenswrapper[4010]: W0319 09:18:02.554943 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ebcecb_c210_434e_83a1_825265e206f1.slice/crio-68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7 WatchSource:0}: Error finding container 68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7: Status 404 returned error can't find the container with id 68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7
Mar 19 09:18:03.016569 master-0 kubenswrapper[4010]: I0319 09:18:03.015958 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-p76jz"]
Mar 19 09:18:03.016802 master-0 kubenswrapper[4010]: I0319 09:18:03.016653 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.016802 master-0 kubenswrapper[4010]: E0319 09:18:03.016741 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:03.101489 master-0 kubenswrapper[4010]: I0319 09:18:03.101288 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.102211 master-0 kubenswrapper[4010]: I0319 09:18:03.101539 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.202530 master-0 kubenswrapper[4010]: I0319 09:18:03.202240 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.202530 master-0 kubenswrapper[4010]: I0319 09:18:03.202278 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.202745 master-0 kubenswrapper[4010]: E0319 09:18:03.202613 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:03.202745 master-0 kubenswrapper[4010]: E0319 09:18:03.202651 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:18:03.702638923 +0000 UTC m=+63.228583530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:03.220572 master-0 kubenswrapper[4010]: I0319 09:18:03.220454 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.533453 master-0 kubenswrapper[4010]: I0319 09:18:03.533386 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerStarted","Data":"68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7"}
Mar 19 09:18:03.705981 master-0 kubenswrapper[4010]: I0319 09:18:03.705907 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:03.706205 master-0 kubenswrapper[4010]: E0319 09:18:03.706088 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:03.706205 master-0 kubenswrapper[4010]: E0319 09:18:03.706154 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:18:04.70613729 +0000 UTC m=+64.232081897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:04.226890 master-0 kubenswrapper[4010]: I0319 09:18:04.226827 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:04.227495 master-0 kubenswrapper[4010]: E0319 09:18:04.226990 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:04.713935 master-0 kubenswrapper[4010]: I0319 09:18:04.713858 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:04.714159 master-0 kubenswrapper[4010]: E0319 09:18:04.713987 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:04.714159 master-0 kubenswrapper[4010]: E0319 09:18:04.714038 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:18:06.714025533 +0000 UTC m=+66.239970140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:05.539142 master-0 kubenswrapper[4010]: I0319 09:18:05.539052 4010 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="a2f44163a580069fe9b4a06584e3e0baeea817a9f7b28d2b1b8dc2d50f42ba8a" exitCode=0
Mar 19 09:18:05.539142 master-0 kubenswrapper[4010]: I0319 09:18:05.539109 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerDied","Data":"a2f44163a580069fe9b4a06584e3e0baeea817a9f7b28d2b1b8dc2d50f42ba8a"}
Mar 19 09:18:06.226220 master-0 kubenswrapper[4010]: I0319 09:18:06.226164 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:06.226372 master-0 kubenswrapper[4010]: E0319 09:18:06.226283 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:06.729081 master-0 kubenswrapper[4010]: I0319 09:18:06.729016 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:06.729737 master-0 kubenswrapper[4010]: E0319 09:18:06.729198 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:06.729737 master-0 kubenswrapper[4010]: E0319 09:18:06.729285 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:18:10.729266429 +0000 UTC m=+70.255211036 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:08.226985 master-0 kubenswrapper[4010]: I0319 09:18:08.226929 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:08.227490 master-0 kubenswrapper[4010]: E0319 09:18:08.227091 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:10.226445 master-0 kubenswrapper[4010]: I0319 09:18:10.226390 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:10.227252 master-0 kubenswrapper[4010]: E0319 09:18:10.226509 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:10.765933 master-0 kubenswrapper[4010]: I0319 09:18:10.765857 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:10.766207 master-0 kubenswrapper[4010]: E0319 09:18:10.765985 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:10.766207 master-0 kubenswrapper[4010]: E0319 09:18:10.766036 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:18:18.766022113 +0000 UTC m=+78.291966720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:11.551516 master-0 kubenswrapper[4010]: I0319 09:18:11.551230 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerStarted","Data":"7335f4e870393336ecca59a320d7b43e9c33ca895a7a0816d7e753f6c020f7af"}
Mar 19 09:18:12.091357 master-0 kubenswrapper[4010]: I0319 09:18:12.091315 4010 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 09:18:12.227024 master-0 kubenswrapper[4010]: I0319 09:18:12.226920 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:12.227416 master-0 kubenswrapper[4010]: E0319 09:18:12.227157 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:14.226416 master-0 kubenswrapper[4010]: I0319 09:18:14.226353 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:14.227200 master-0 kubenswrapper[4010]: E0319 09:18:14.226564 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:14.412947 master-0 kubenswrapper[4010]: I0319 09:18:14.412526 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"]
Mar 19 09:18:14.413120 master-0 kubenswrapper[4010]: I0319 09:18:14.412959 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.415638 master-0 kubenswrapper[4010]: I0319 09:18:14.414934 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:18:14.415638 master-0 kubenswrapper[4010]: I0319 09:18:14.415225 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 09:18:14.415638 master-0 kubenswrapper[4010]: I0319 09:18:14.415359 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 09:18:14.415638 master-0 kubenswrapper[4010]: I0319 09:18:14.415521 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 09:18:14.417646 master-0 kubenswrapper[4010]: I0319 09:18:14.417567 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:18:14.494073 master-0 kubenswrapper[4010]: I0319 09:18:14.493958 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.494073 master-0 kubenswrapper[4010]: I0319 09:18:14.494039 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.494258 master-0 kubenswrapper[4010]: I0319 09:18:14.494076 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.494258 master-0 kubenswrapper[4010]: I0319 09:18:14.494100 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.595172 master-0 kubenswrapper[4010]: I0319 09:18:14.595127 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.595172 master-0 kubenswrapper[4010]: I0319 09:18:14.595172 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.595387 master-0 kubenswrapper[4010]: I0319 09:18:14.595210 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.595387 master-0 kubenswrapper[4010]: I0319 09:18:14.595365 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.596759 master-0 kubenswrapper[4010]: I0319 09:18:14.596735 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.597345 master-0 kubenswrapper[4010]: I0319 09:18:14.597088 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.599586 master-0 kubenswrapper[4010]: I0319 09:18:14.599540 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.615939 master-0 kubenswrapper[4010]: I0319 09:18:14.615859 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:18:14.623510 master-0 kubenswrapper[4010]: I0319 09:18:14.623457 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-22clf"]
Mar 19 09:18:14.624121 master-0 kubenswrapper[4010]: I0319 09:18:14.624095 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.625538 master-0 kubenswrapper[4010]: I0319 09:18:14.625495 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:18:14.625976 master-0 kubenswrapper[4010]: I0319 09:18:14.625963 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 09:18:14.727566 master-0 kubenswrapper[4010]: I0319 09:18:14.727515 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" Mar 19 09:18:14.737366 master-0 kubenswrapper[4010]: W0319 09:18:14.737320 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2f6f98_3bbe_42cc_81c2_f498b17e4ef5.slice/crio-ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43 WatchSource:0}: Error finding container ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43: Status 404 returned error can't find the container with id ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43 Mar 19 09:18:14.797846 master-0 kubenswrapper[4010]: I0319 09:18:14.797582 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-etc-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.797846 master-0 kubenswrapper[4010]: I0319 09:18:14.797632 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-22clf\" (UID: 
\"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.797846 master-0 kubenswrapper[4010]: I0319 09:18:14.797649 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-config\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.797846 master-0 kubenswrapper[4010]: I0319 09:18:14.797780 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-node-log\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.797893 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-systemd-units\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.797929 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-env-overrides\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.797965 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-ovn\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798008 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-kubelet\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798093 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqfx\" (UniqueName: \"kubernetes.io/projected/57ac3af0-7d16-4715-9afa-6e98a2777e6e-kube-api-access-gfqfx\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798140 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-netd\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798166 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-slash\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798183 4010 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-var-lib-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798198 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-log-socket\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798203 master-0 kubenswrapper[4010]: I0319 09:18:14.798213 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-bin\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798629 master-0 kubenswrapper[4010]: I0319 09:18:14.798261 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovn-node-metrics-cert\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798629 master-0 kubenswrapper[4010]: I0319 09:18:14.798336 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-script-lib\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 
09:18:14.798629 master-0 kubenswrapper[4010]: I0319 09:18:14.798394 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798629 master-0 kubenswrapper[4010]: I0319 09:18:14.798432 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798629 master-0 kubenswrapper[4010]: I0319 09:18:14.798493 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-systemd\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.798629 master-0 kubenswrapper[4010]: I0319 09:18:14.798525 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-netns\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899543 master-0 kubenswrapper[4010]: I0319 09:18:14.899406 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-node-log\") pod \"ovnkube-node-22clf\" (UID: 
\"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899543 master-0 kubenswrapper[4010]: I0319 09:18:14.899442 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-systemd-units\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899543 master-0 kubenswrapper[4010]: I0319 09:18:14.899460 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-env-overrides\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899543 master-0 kubenswrapper[4010]: I0319 09:18:14.899495 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-ovn\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899570 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-node-log\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899635 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-systemd-units\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899633 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-kubelet\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899666 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-kubelet\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899668 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqfx\" (UniqueName: \"kubernetes.io/projected/57ac3af0-7d16-4715-9afa-6e98a2777e6e-kube-api-access-gfqfx\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899709 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-netd\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899729 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-slash\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899746 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-var-lib-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899762 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-log-socket\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899777 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-bin\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.899788 master-0 kubenswrapper[4010]: I0319 09:18:14.899794 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovn-node-metrics-cert\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899827 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-script-lib\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899854 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899872 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899891 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-netns\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899906 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-systemd\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899930 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-etc-openvswitch\") pod \"ovnkube-node-22clf\" (UID: 
\"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899947 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900263 master-0 kubenswrapper[4010]: I0319 09:18:14.899961 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-config\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900778 master-0 kubenswrapper[4010]: I0319 09:18:14.900682 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-ovn\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900778 master-0 kubenswrapper[4010]: I0319 09:18:14.900683 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-config\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900778 master-0 kubenswrapper[4010]: I0319 09:18:14.900731 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-var-lib-openvswitch\") pod \"ovnkube-node-22clf\" (UID: 
\"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900778 master-0 kubenswrapper[4010]: I0319 09:18:14.900760 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-systemd\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.900778 master-0 kubenswrapper[4010]: I0319 09:18:14.900770 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-log-socket\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.900771 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-netns\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.900800 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-env-overrides\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.900818 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-slash\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.900846 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-bin\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.900914 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-netd\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.900961 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-etc-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901004 master-0 kubenswrapper[4010]: I0319 09:18:14.901000 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901270 master-0 kubenswrapper[4010]: I0319 09:18:14.901048 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-22clf\" (UID: 
\"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901270 master-0 kubenswrapper[4010]: I0319 09:18:14.901076 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-openvswitch\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.901350 master-0 kubenswrapper[4010]: I0319 09:18:14.901299 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-script-lib\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.905501 master-0 kubenswrapper[4010]: I0319 09:18:14.905437 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovn-node-metrics-cert\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.916436 master-0 kubenswrapper[4010]: I0319 09:18:14.916119 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqfx\" (UniqueName: \"kubernetes.io/projected/57ac3af0-7d16-4715-9afa-6e98a2777e6e-kube-api-access-gfqfx\") pod \"ovnkube-node-22clf\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:18:14.938649 master-0 kubenswrapper[4010]: I0319 09:18:14.938598 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-22clf"
Mar 19 09:18:14.953869 master-0 kubenswrapper[4010]: W0319 09:18:14.953820 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ac3af0_7d16_4715_9afa_6e98a2777e6e.slice/crio-c283b976ccff2f081d129aad2281421561a14a7be4a6f3749d2de0cb2ccb0b0b WatchSource:0}: Error finding container c283b976ccff2f081d129aad2281421561a14a7be4a6f3749d2de0cb2ccb0b0b: Status 404 returned error can't find the container with id c283b976ccff2f081d129aad2281421561a14a7be4a6f3749d2de0cb2ccb0b0b
Mar 19 09:18:15.561151 master-0 kubenswrapper[4010]: I0319 09:18:15.561077 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svct" event={"ID":"872e5f8c-b014-4283-a4d2-0e2cfd29e192","Type":"ContainerStarted","Data":"b504737085975340ca235cec0c4c9e74b2eb5d8b9a50455476ac176eb78b4a5c"}
Mar 19 09:18:15.564092 master-0 kubenswrapper[4010]: I0319 09:18:15.564007 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" event={"ID":"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5","Type":"ContainerStarted","Data":"dca5ac8d9d3dfc36323d8bdbf50ed76474a98b11d8b57cf5b84a987a3d2693f2"}
Mar 19 09:18:15.564092 master-0 kubenswrapper[4010]: I0319 09:18:15.564071 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" event={"ID":"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5","Type":"ContainerStarted","Data":"ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43"}
Mar 19 09:18:15.566604 master-0 kubenswrapper[4010]: I0319 09:18:15.566544 4010 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="7335f4e870393336ecca59a320d7b43e9c33ca895a7a0816d7e753f6c020f7af" exitCode=0
Mar 19 09:18:15.566751 master-0 kubenswrapper[4010]: I0319 09:18:15.566591 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerDied","Data":"7335f4e870393336ecca59a320d7b43e9c33ca895a7a0816d7e753f6c020f7af"}
Mar 19 09:18:15.568009 master-0 kubenswrapper[4010]: I0319 09:18:15.567900 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"c283b976ccff2f081d129aad2281421561a14a7be4a6f3749d2de0cb2ccb0b0b"}
Mar 19 09:18:15.617764 master-0 kubenswrapper[4010]: I0319 09:18:15.617169 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8svct" podStartSLOduration=1.255394099 podStartE2EDuration="13.617138863s" podCreationTimestamp="2026-03-19 09:18:02 +0000 UTC" firstStartedPulling="2026-03-19 09:18:02.359975592 +0000 UTC m=+61.885920199" lastFinishedPulling="2026-03-19 09:18:14.721720356 +0000 UTC m=+74.247664963" observedRunningTime="2026-03-19 09:18:15.617033319 +0000 UTC m=+75.142977926" watchObservedRunningTime="2026-03-19 09:18:15.617138863 +0000 UTC m=+75.143083500"
Mar 19 09:18:16.110492 master-0 kubenswrapper[4010]: I0319 09:18:16.110389 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:18:16.110700 master-0 kubenswrapper[4010]: E0319 09:18:16.110628 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:18:16.110767 master-0 kubenswrapper[4010]: E0319 09:18:16.110712 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:48.110694225 +0000 UTC m=+107.636638832 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:18:16.227023 master-0 kubenswrapper[4010]: I0319 09:18:16.226965 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:16.227214 master-0 kubenswrapper[4010]: E0319 09:18:16.227109 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:17.574407 master-0 kubenswrapper[4010]: I0319 09:18:17.574357 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerStarted","Data":"665177f0301e1fc60d7ae832223fecb7c16c65e7cc5cfa86a5c6a63e7efdc407"}
Mar 19 09:18:17.603373 master-0 kubenswrapper[4010]: I0319 09:18:17.603246 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-95w9b"]
Mar 19 09:18:17.603628 master-0 kubenswrapper[4010]: I0319 09:18:17.603575 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:17.603680 master-0 kubenswrapper[4010]: E0319 09:18:17.603630 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:17.724682 master-0 kubenswrapper[4010]: I0319 09:18:17.724641 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:17.826218 master-0 kubenswrapper[4010]: I0319 09:18:17.826172 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:17.839764 master-0 kubenswrapper[4010]: E0319 09:18:17.839735 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:18:17.839764 master-0 kubenswrapper[4010]: E0319 09:18:17.839765 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:18:17.839873 master-0 kubenswrapper[4010]: E0319 09:18:17.839775 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:17.839873 master-0 kubenswrapper[4010]: E0319 09:18:17.839827 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:18.339812889 +0000 UTC m=+77.865757496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:18.226697 master-0 kubenswrapper[4010]: I0319 09:18:18.226642 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:18.226891 master-0 kubenswrapper[4010]: E0319 09:18:18.226780 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:18.430486 master-0 kubenswrapper[4010]: I0319 09:18:18.430420 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:18.430650 master-0 kubenswrapper[4010]: E0319 09:18:18.430616 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:18:18.430650 master-0 kubenswrapper[4010]: E0319 09:18:18.430645 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:18:18.430715 master-0 kubenswrapper[4010]: E0319 09:18:18.430656 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:18.430746 master-0 kubenswrapper[4010]: E0319 09:18:18.430719 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:19.430699531 +0000 UTC m=+78.956644148 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:18.579877 master-0 kubenswrapper[4010]: I0319 09:18:18.579768 4010 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="665177f0301e1fc60d7ae832223fecb7c16c65e7cc5cfa86a5c6a63e7efdc407" exitCode=0
Mar 19 09:18:18.579877 master-0 kubenswrapper[4010]: I0319 09:18:18.579815 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerDied","Data":"665177f0301e1fc60d7ae832223fecb7c16c65e7cc5cfa86a5c6a63e7efdc407"}
Mar 19 09:18:18.834193 master-0 kubenswrapper[4010]: I0319 09:18:18.834042 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:18.834378 master-0 kubenswrapper[4010]: E0319 09:18:18.834300 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:18.834461 master-0 kubenswrapper[4010]: E0319 09:18:18.834429 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:18:34.834395199 +0000 UTC m=+94.360339806 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 09:18:19.227418 master-0 kubenswrapper[4010]: I0319 09:18:19.227131 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:19.227418 master-0 kubenswrapper[4010]: E0319 09:18:19.227388 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:19.239425 master-0 kubenswrapper[4010]: W0319 09:18:19.239376 4010 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 19 09:18:19.239651 master-0 kubenswrapper[4010]: I0319 09:18:19.239629 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 19 09:18:19.439826 master-0 kubenswrapper[4010]: I0319 09:18:19.439748 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:19.440138 master-0 kubenswrapper[4010]: E0319 09:18:19.440081 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:18:19.440138 master-0 kubenswrapper[4010]: E0319 09:18:19.440134 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:18:19.440268 master-0 kubenswrapper[4010]: E0319 09:18:19.440151 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:19.440268 master-0 kubenswrapper[4010]: E0319 09:18:19.440244 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:21.440218974 +0000 UTC m=+80.966163581 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:20.224712 master-0 kubenswrapper[4010]: I0319 09:18:20.224668 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-kqb2h"]
Mar 19 09:18:20.225143 master-0 kubenswrapper[4010]: I0319 09:18:20.224969 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.226666 master-0 kubenswrapper[4010]: I0319 09:18:20.226615 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:18:20.227130 master-0 kubenswrapper[4010]: I0319 09:18:20.227091 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:20.227234 master-0 kubenswrapper[4010]: I0319 09:18:20.227207 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 09:18:20.227283 master-0 kubenswrapper[4010]: I0319 09:18:20.227262 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 09:18:20.227344 master-0 kubenswrapper[4010]: E0319 09:18:20.227275 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:20.227389 master-0 kubenswrapper[4010]: I0319 09:18:20.227380 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:18:20.227428 master-0 kubenswrapper[4010]: I0319 09:18:20.227407 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 09:18:20.240382 master-0 kubenswrapper[4010]: I0319 09:18:20.240306 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=1.2402865969999999 podStartE2EDuration="1.240286597s" podCreationTimestamp="2026-03-19 09:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:18:20.238454034 +0000 UTC m=+79.764398641" watchObservedRunningTime="2026-03-19 09:18:20.240286597 +0000 UTC m=+79.766231224"
Mar 19 09:18:20.349487 master-0 kubenswrapper[4010]: I0319 09:18:20.349311 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.349487 master-0 kubenswrapper[4010]: I0319 09:18:20.349387 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.349487 master-0 kubenswrapper[4010]: I0319 09:18:20.349409 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.349487 master-0 kubenswrapper[4010]: I0319 09:18:20.349429 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.451006 master-0 kubenswrapper[4010]: I0319 09:18:20.450477 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.451006 master-0 kubenswrapper[4010]: E0319 09:18:20.450645 4010 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found
Mar 19 09:18:20.451006 master-0 kubenswrapper[4010]: I0319 09:18:20.450667 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.451006 master-0 kubenswrapper[4010]: I0319 09:18:20.450705 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.451006 master-0 kubenswrapper[4010]: E0319 09:18:20.450733 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert podName:b2898746-6827-41d9-ac88-64206cb84ac9 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:20.950708795 +0000 UTC m=+80.476653402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert") pod "network-node-identity-kqb2h" (UID: "b2898746-6827-41d9-ac88-64206cb84ac9") : secret "network-node-identity-cert" not found
Mar 19 09:18:20.451006 master-0 kubenswrapper[4010]: I0319 09:18:20.450798 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.451909 master-0 kubenswrapper[4010]: I0319 09:18:20.451867 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.452026 master-0 kubenswrapper[4010]: I0319 09:18:20.451992 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.840774 master-0 kubenswrapper[4010]: I0319 09:18:20.832620 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.954257 master-0 kubenswrapper[4010]: I0319 09:18:20.954170 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:20.954457 master-0 kubenswrapper[4010]: E0319 09:18:20.954381 4010 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found
Mar 19 09:18:20.954528 master-0 kubenswrapper[4010]: E0319 09:18:20.954483 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert podName:b2898746-6827-41d9-ac88-64206cb84ac9 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:21.954448782 +0000 UTC m=+81.480393389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert") pod "network-node-identity-kqb2h" (UID: "b2898746-6827-41d9-ac88-64206cb84ac9") : secret "network-node-identity-cert" not found
Mar 19 09:18:21.226989 master-0 kubenswrapper[4010]: I0319 09:18:21.226935 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:21.227666 master-0 kubenswrapper[4010]: E0319 09:18:21.227560 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:21.458588 master-0 kubenswrapper[4010]: I0319 09:18:21.458526 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:21.458758 master-0 kubenswrapper[4010]: E0319 09:18:21.458692 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:18:21.458758 master-0 kubenswrapper[4010]: E0319 09:18:21.458707 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:18:21.458758 master-0 kubenswrapper[4010]: E0319 09:18:21.458717 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:21.458842 master-0 kubenswrapper[4010]: E0319 09:18:21.458764 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:25.458751965 +0000 UTC m=+84.984696572 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:21.589697 master-0 kubenswrapper[4010]: I0319 09:18:21.589565 4010 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="b1fd1a1a09332960aaf03f0be319bfd31ad0e612d2387b20f773844856dcefe5" exitCode=0
Mar 19 09:18:21.589697 master-0 kubenswrapper[4010]: I0319 09:18:21.589614 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerDied","Data":"b1fd1a1a09332960aaf03f0be319bfd31ad0e612d2387b20f773844856dcefe5"}
Mar 19 09:18:21.963332 master-0 kubenswrapper[4010]: I0319 09:18:21.963280 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:21.965914 master-0 kubenswrapper[4010]: I0319 09:18:21.965887 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:22.040567 master-0 kubenswrapper[4010]: I0319 09:18:22.040483 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:18:22.052566 master-0 kubenswrapper[4010]: W0319 09:18:22.052530 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2898746_6827_41d9_ac88_64206cb84ac9.slice/crio-7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42 WatchSource:0}: Error finding container 7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42: Status 404 returned error can't find the container with id 7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42
Mar 19 09:18:22.226864 master-0 kubenswrapper[4010]: I0319 09:18:22.226775 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:22.227025 master-0 kubenswrapper[4010]: E0319 09:18:22.226900 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:22.594795 master-0 kubenswrapper[4010]: I0319 09:18:22.594559 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerStarted","Data":"7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42"}
Mar 19 09:18:23.226352 master-0 kubenswrapper[4010]: I0319 09:18:23.226306 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:23.226579 master-0 kubenswrapper[4010]: E0319 09:18:23.226416 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:24.226563 master-0 kubenswrapper[4010]: I0319 09:18:24.226496 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:24.227109 master-0 kubenswrapper[4010]: E0319 09:18:24.226658 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:25.227786 master-0 kubenswrapper[4010]: I0319 09:18:25.227736 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:25.228209 master-0 kubenswrapper[4010]: E0319 09:18:25.227879 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:25.496176 master-0 kubenswrapper[4010]: I0319 09:18:25.496078 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:25.496339 master-0 kubenswrapper[4010]: E0319 09:18:25.496290 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:18:25.496339 master-0 kubenswrapper[4010]: E0319 09:18:25.496319 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:18:25.496339 master-0 kubenswrapper[4010]: E0319 09:18:25.496331 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:25.496445 master-0 kubenswrapper[4010]: E0319 09:18:25.496385 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:33.496369092 +0000 UTC m=+93.022313699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:26.226532 master-0 kubenswrapper[4010]: I0319 09:18:26.226435 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:26.226728 master-0 kubenswrapper[4010]: E0319 09:18:26.226569 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:27.226635 master-0 kubenswrapper[4010]: I0319 09:18:27.226421 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:27.227308 master-0 kubenswrapper[4010]: E0319 09:18:27.226706 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:28.226783 master-0 kubenswrapper[4010]: I0319 09:18:28.226699 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:28.227385 master-0 kubenswrapper[4010]: E0319 09:18:28.227174 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:29.227380 master-0 kubenswrapper[4010]: I0319 09:18:29.227303 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:29.228612 master-0 kubenswrapper[4010]: E0319 09:18:29.227429 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:29.303583 master-0 kubenswrapper[4010]: I0319 09:18:29.303186 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 19 09:18:30.227124 master-0 kubenswrapper[4010]: I0319 09:18:30.227047 4010 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:30.227331 master-0 kubenswrapper[4010]: E0319 09:18:30.227202 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:31.226888 master-0 kubenswrapper[4010]: I0319 09:18:31.226848 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:31.230576 master-0 kubenswrapper[4010]: E0319 09:18:31.230505 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:32.234149 master-0 kubenswrapper[4010]: I0319 09:18:32.234065 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:32.234946 master-0 kubenswrapper[4010]: E0319 09:18:32.234352 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:32.234946 master-0 kubenswrapper[4010]: I0319 09:18:32.234670 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=4.234647908 podStartE2EDuration="4.234647908s" podCreationTimestamp="2026-03-19 09:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:18:32.234272508 +0000 UTC m=+91.760217115" watchObservedRunningTime="2026-03-19 09:18:32.234647908 +0000 UTC m=+91.760592515" Mar 19 09:18:33.227193 master-0 kubenswrapper[4010]: I0319 09:18:33.226926 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:33.227193 master-0 kubenswrapper[4010]: E0319 09:18:33.227158 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:33.567763 master-0 kubenswrapper[4010]: I0319 09:18:33.567528 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:33.568217 master-0 kubenswrapper[4010]: E0319 09:18:33.567764 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:18:33.568217 master-0 kubenswrapper[4010]: E0319 09:18:33.567803 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:18:33.568217 master-0 kubenswrapper[4010]: E0319 09:18:33.567817 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:18:33.568217 master-0 kubenswrapper[4010]: E0319 09:18:33.567878 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:18:49.567861041 +0000 UTC m=+109.093805648 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:18:34.226620 master-0 kubenswrapper[4010]: I0319 09:18:34.226562 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:34.226826 master-0 kubenswrapper[4010]: E0319 09:18:34.226736 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:34.882484 master-0 kubenswrapper[4010]: I0319 09:18:34.882394 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:34.883024 master-0 kubenswrapper[4010]: E0319 09:18:34.882615 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:34.883024 master-0 kubenswrapper[4010]: E0319 09:18:34.882704 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:06.88268395 +0000 UTC m=+126.408628617 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:18:35.228503 master-0 kubenswrapper[4010]: I0319 09:18:35.228422 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:35.228830 master-0 kubenswrapper[4010]: E0319 09:18:35.228541 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:36.226584 master-0 kubenswrapper[4010]: I0319 09:18:36.226522 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:36.228091 master-0 kubenswrapper[4010]: E0319 09:18:36.226649 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:37.226766 master-0 kubenswrapper[4010]: I0319 09:18:37.226691 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:37.228032 master-0 kubenswrapper[4010]: E0319 09:18:37.226873 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:38.227099 master-0 kubenswrapper[4010]: I0319 09:18:38.227035 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:38.227598 master-0 kubenswrapper[4010]: E0319 09:18:38.227185 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:38.629358 master-0 kubenswrapper[4010]: I0319 09:18:38.629121 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 09:18:39.227371 master-0 kubenswrapper[4010]: I0319 09:18:39.227267 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:39.227939 master-0 kubenswrapper[4010]: E0319 09:18:39.227545 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:39.641340 master-0 kubenswrapper[4010]: I0319 09:18:39.640651 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerStarted","Data":"a5f501670eb3ea46a2e9833a8efe0358489fe82196edec8a883f420d084aeb16"} Mar 19 09:18:39.658235 master-0 kubenswrapper[4010]: I0319 09:18:39.658126 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.658100747 podStartE2EDuration="1.658100747s" podCreationTimestamp="2026-03-19 09:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:18:39.657445798 +0000 UTC m=+99.183390415" watchObservedRunningTime="2026-03-19 09:18:39.658100747 +0000 UTC m=+99.184045364" Mar 19 09:18:40.226930 master-0 kubenswrapper[4010]: I0319 09:18:40.226827 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:40.227288 master-0 kubenswrapper[4010]: E0319 09:18:40.227078 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:40.648848 master-0 kubenswrapper[4010]: I0319 09:18:40.648625 4010 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="a5f501670eb3ea46a2e9833a8efe0358489fe82196edec8a883f420d084aeb16" exitCode=0 Mar 19 09:18:40.648848 master-0 kubenswrapper[4010]: I0319 09:18:40.648690 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerDied","Data":"a5f501670eb3ea46a2e9833a8efe0358489fe82196edec8a883f420d084aeb16"} Mar 19 09:18:40.650572 master-0 kubenswrapper[4010]: I0319 09:18:40.650534 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" event={"ID":"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5","Type":"ContainerStarted","Data":"7625e1722b2e3b80ecf85f84a7ed20af518fcbf3270f5b73f90321c127613131"} Mar 19 09:18:40.652997 master-0 kubenswrapper[4010]: I0319 09:18:40.652913 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerStarted","Data":"5f66b7b4498be8ffcef1be07d5415ae49ca99cf0c15b74518d97c2537613d5cc"} Mar 19 09:18:40.652997 master-0 kubenswrapper[4010]: I0319 09:18:40.652984 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerStarted","Data":"0b1563900a6630ed57c4439686a9e7bd79e7fc5f59be3ea1893b76d89e5bc81f"} Mar 19 09:18:40.654354 master-0 kubenswrapper[4010]: I0319 09:18:40.654290 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" 
containerID="f3f14910b909c8727132f1ac9221bf9b5690b3430ea17e089e4840574b473f78" exitCode=0 Mar 19 09:18:40.654354 master-0 kubenswrapper[4010]: I0319 09:18:40.654326 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"f3f14910b909c8727132f1ac9221bf9b5690b3430ea17e089e4840574b473f78"} Mar 19 09:18:40.736784 master-0 kubenswrapper[4010]: I0319 09:18:40.735772 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-kqb2h" podStartSLOduration=3.155851592 podStartE2EDuration="20.735732634s" podCreationTimestamp="2026-03-19 09:18:20 +0000 UTC" firstStartedPulling="2026-03-19 09:18:22.054083686 +0000 UTC m=+81.580028293" lastFinishedPulling="2026-03-19 09:18:39.633964718 +0000 UTC m=+99.159909335" observedRunningTime="2026-03-19 09:18:40.717852256 +0000 UTC m=+100.243796873" watchObservedRunningTime="2026-03-19 09:18:40.735732634 +0000 UTC m=+100.261677251" Mar 19 09:18:41.227603 master-0 kubenswrapper[4010]: I0319 09:18:41.226781 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:41.228039 master-0 kubenswrapper[4010]: E0319 09:18:41.227944 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:41.662295 master-0 kubenswrapper[4010]: I0319 09:18:41.661541 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e"} Mar 19 09:18:41.662295 master-0 kubenswrapper[4010]: I0319 09:18:41.662142 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"4f4c4c7bacb4c82526cc0b717400eed5575e7255b8fc41ce95e4db61be21ac21"} Mar 19 09:18:41.662295 master-0 kubenswrapper[4010]: I0319 09:18:41.662170 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"f7e4a347ce2589cef46085f03e6a3c4fbde90d30b9b65bf4f08fba449a466100"} Mar 19 09:18:41.662295 master-0 kubenswrapper[4010]: I0319 09:18:41.662233 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"201c6766340c20cd2107ef0e5bda47bb093b5e2cb7e924c58384097520fb652a"} Mar 19 09:18:41.662295 master-0 kubenswrapper[4010]: I0319 09:18:41.662252 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"1980c8d39693f82bcc98b960571af6d122ab09b758b31f3eb0468f4d6840dac2"} Mar 19 09:18:41.662295 master-0 kubenswrapper[4010]: I0319 09:18:41.662271 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" 
event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"0486d5600b5ffa9af48869cd91a73a55fed8856efb519934adb831f0de1d5b12"} Mar 19 09:18:41.667059 master-0 kubenswrapper[4010]: I0319 09:18:41.667005 4010 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="dee05648403cf8d6ee35acba18e21f4c87a759e5c8fc08c0570622f3df3f33e1" exitCode=0 Mar 19 09:18:41.667263 master-0 kubenswrapper[4010]: I0319 09:18:41.667195 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerDied","Data":"dee05648403cf8d6ee35acba18e21f4c87a759e5c8fc08c0570622f3df3f33e1"} Mar 19 09:18:41.693896 master-0 kubenswrapper[4010]: I0319 09:18:41.693822 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" podStartSLOduration=3.032506106 podStartE2EDuration="27.693804356s" podCreationTimestamp="2026-03-19 09:18:14 +0000 UTC" firstStartedPulling="2026-03-19 09:18:14.933618036 +0000 UTC m=+74.459562643" lastFinishedPulling="2026-03-19 09:18:39.594916276 +0000 UTC m=+99.120860893" observedRunningTime="2026-03-19 09:18:40.736863916 +0000 UTC m=+100.262808533" watchObservedRunningTime="2026-03-19 09:18:41.693804356 +0000 UTC m=+101.219748963" Mar 19 09:18:41.694268 master-0 kubenswrapper[4010]: I0319 09:18:41.694238 4010 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-22clf"] Mar 19 09:18:42.227009 master-0 kubenswrapper[4010]: I0319 09:18:42.226928 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:42.227325 master-0 kubenswrapper[4010]: E0319 09:18:42.227254 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:42.676656 master-0 kubenswrapper[4010]: I0319 09:18:42.676427 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" event={"ID":"e9ebcecb-c210-434e-83a1-825265e206f1","Type":"ContainerStarted","Data":"0786d185e3954f27e19e050bbd7c235bc8cb77475d30e2d9974c7938c812a015"} Mar 19 09:18:43.226680 master-0 kubenswrapper[4010]: I0319 09:18:43.226592 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:43.226997 master-0 kubenswrapper[4010]: E0319 09:18:43.226726 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:43.688874 master-0 kubenswrapper[4010]: I0319 09:18:43.687830 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1"} Mar 19 09:18:44.226923 master-0 kubenswrapper[4010]: I0319 09:18:44.226831 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:44.227188 master-0 kubenswrapper[4010]: E0319 09:18:44.227118 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:45.227250 master-0 kubenswrapper[4010]: I0319 09:18:45.227132 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:18:45.228805 master-0 kubenswrapper[4010]: E0319 09:18:45.227294 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:18:46.226938 master-0 kubenswrapper[4010]: I0319 09:18:46.226838 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:18:46.227179 master-0 kubenswrapper[4010]: E0319 09:18:46.227005 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:18:46.701660 master-0 kubenswrapper[4010]: I0319 09:18:46.701231 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerStarted","Data":"6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc"} Mar 19 09:18:46.701660 master-0 kubenswrapper[4010]: I0319 09:18:46.701585 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb" containerID="cri-o://37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" gracePeriod=30 Mar 19 09:18:46.701660 master-0 kubenswrapper[4010]: I0319 09:18:46.701612 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f7e4a347ce2589cef46085f03e6a3c4fbde90d30b9b65bf4f08fba449a466100" gracePeriod=30 Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701733 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="northd" containerID="cri-o://4f4c4c7bacb4c82526cc0b717400eed5575e7255b8fc41ce95e4db61be21ac21" 
gracePeriod=30
Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701761 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-22clf"
Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701795 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-22clf"
Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701751 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-node" containerID="cri-o://201c6766340c20cd2107ef0e5bda47bb093b5e2cb7e924c58384097520fb652a" gracePeriod=30
Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701852 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-controller" containerID="cri-o://0486d5600b5ffa9af48869cd91a73a55fed8856efb519934adb831f0de1d5b12" gracePeriod=30
Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701908 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb" containerID="cri-o://eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" gracePeriod=30
Mar 19 09:18:46.703361 master-0 kubenswrapper[4010]: I0319 09:18:46.701800 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-acl-logging" containerID="cri-o://1980c8d39693f82bcc98b960571af6d122ab09b758b31f3eb0468f4d6840dac2" gracePeriod=30
Mar 19 09:18:46.704817 master-0 kubenswrapper[4010]: E0319 09:18:46.704627 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:18:46.706664 master-0 kubenswrapper[4010]: E0319 09:18:46.705985 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:18:46.709861 master-0 kubenswrapper[4010]: E0319 09:18:46.709066 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:18:46.709861 master-0 kubenswrapper[4010]: E0319 09:18:46.709128 4010 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb"
Mar 19 09:18:46.720271 master-0 kubenswrapper[4010]: I0319 09:18:46.720183 4010 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovnkube-controller" containerID="cri-o://6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc" gracePeriod=30
Mar 19 09:18:47.227212 master-0 kubenswrapper[4010]: I0319 09:18:47.227042 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:47.227212 master-0 kubenswrapper[4010]: E0319 09:18:47.227184 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:47.706352 master-0 kubenswrapper[4010]: I0319 09:18:47.706300 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 09:18:47.707415 master-0 kubenswrapper[4010]: I0319 09:18:47.707388 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-node/0.log"
Mar 19 09:18:47.708165 master-0 kubenswrapper[4010]: I0319 09:18:47.708110 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-acl-logging/0.log"
Mar 19 09:18:47.708947 master-0 kubenswrapper[4010]: I0319 09:18:47.708923 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-controller/0.log"
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709416 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" exitCode=0
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709477 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1"}
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709507 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="f7e4a347ce2589cef46085f03e6a3c4fbde90d30b9b65bf4f08fba449a466100" exitCode=143
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709545 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="201c6766340c20cd2107ef0e5bda47bb093b5e2cb7e924c58384097520fb652a" exitCode=143
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709565 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="1980c8d39693f82bcc98b960571af6d122ab09b758b31f3eb0468f4d6840dac2" exitCode=143
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709582 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="0486d5600b5ffa9af48869cd91a73a55fed8856efb519934adb831f0de1d5b12" exitCode=143
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709516 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"f7e4a347ce2589cef46085f03e6a3c4fbde90d30b9b65bf4f08fba449a466100"}
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709649 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"201c6766340c20cd2107ef0e5bda47bb093b5e2cb7e924c58384097520fb652a"}
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709674 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"1980c8d39693f82bcc98b960571af6d122ab09b758b31f3eb0468f4d6840dac2"}
Mar 19 09:18:47.709814 master-0 kubenswrapper[4010]: I0319 09:18:47.709697 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"0486d5600b5ffa9af48869cd91a73a55fed8856efb519934adb831f0de1d5b12"}
Mar 19 09:18:48.119452 master-0 kubenswrapper[4010]: I0319 09:18:48.119176 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:18:48.119452 master-0 kubenswrapper[4010]: E0319 09:18:48.119335 4010 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:18:48.119452 master-0 kubenswrapper[4010]: E0319 09:18:48.119405 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:52.11938785 +0000 UTC m=+171.645332467 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:18:48.226650 master-0 kubenswrapper[4010]: I0319 09:18:48.226598 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:48.226806 master-0 kubenswrapper[4010]: E0319 09:18:48.226775 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:48.991457 master-0 kubenswrapper[4010]: I0319 09:18:48.991407 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 09:18:48.992087 master-0 kubenswrapper[4010]: I0319 09:18:48.991829 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-node/0.log"
Mar 19 09:18:48.992242 master-0 kubenswrapper[4010]: I0319 09:18:48.992206 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-acl-logging/0.log"
Mar 19 09:18:48.992630 master-0 kubenswrapper[4010]: I0319 09:18:48.992597 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-controller/0.log"
Mar 19 09:18:48.992967 master-0 kubenswrapper[4010]: I0319 09:18:48.992935 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" exitCode=0
Mar 19 09:18:48.992967 master-0 kubenswrapper[4010]: I0319 09:18:48.992962 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="4f4c4c7bacb4c82526cc0b717400eed5575e7255b8fc41ce95e4db61be21ac21" exitCode=0
Mar 19 09:18:48.993096 master-0 kubenswrapper[4010]: I0319 09:18:48.992982 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e"}
Mar 19 09:18:48.993096 master-0 kubenswrapper[4010]: I0319 09:18:48.993007 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"4f4c4c7bacb4c82526cc0b717400eed5575e7255b8fc41ce95e4db61be21ac21"}
Mar 19 09:18:49.023191 master-0 kubenswrapper[4010]: I0319 09:18:49.023130 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tjzdb" podStartSLOduration=10.306529323 podStartE2EDuration="47.023115157s" podCreationTimestamp="2026-03-19 09:18:02 +0000 UTC" firstStartedPulling="2026-03-19 09:18:02.559282427 +0000 UTC m=+62.085227044" lastFinishedPulling="2026-03-19 09:18:39.275868261 +0000 UTC m=+98.801812878" observedRunningTime="2026-03-19 09:18:42.700041173 +0000 UTC m=+102.225985780" watchObservedRunningTime="2026-03-19 09:18:49.023115157 +0000 UTC m=+108.549059764"
Mar 19 09:18:49.023340 master-0 kubenswrapper[4010]: I0319 09:18:49.023250 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podStartSLOduration=10.21352158 podStartE2EDuration="35.023247242s" podCreationTimestamp="2026-03-19 09:18:14 +0000 UTC" firstStartedPulling="2026-03-19 09:18:14.956131528 +0000 UTC m=+74.482076145" lastFinishedPulling="2026-03-19 09:18:39.7658572 +0000 UTC m=+99.291801807" observedRunningTime="2026-03-19 09:18:49.023185859 +0000 UTC m=+108.549130476" watchObservedRunningTime="2026-03-19 09:18:49.023247242 +0000 UTC m=+108.549191849"
Mar 19 09:18:49.227617 master-0 kubenswrapper[4010]: I0319 09:18:49.227039 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:49.227617 master-0 kubenswrapper[4010]: E0319 09:18:49.227180 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:49.588895 master-0 kubenswrapper[4010]: I0319 09:18:49.588813 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:49.589814 master-0 kubenswrapper[4010]: E0319 09:18:49.589760 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 09:18:49.589814 master-0 kubenswrapper[4010]: E0319 09:18:49.589803 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 09:18:49.590028 master-0 kubenswrapper[4010]: E0319 09:18:49.589821 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:49.590028 master-0 kubenswrapper[4010]: E0319 09:18:49.589912 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:21.589889631 +0000 UTC m=+141.115834228 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 09:18:50.226400 master-0 kubenswrapper[4010]: I0319 09:18:50.226332 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:50.226879 master-0 kubenswrapper[4010]: E0319 09:18:50.226508 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:51.226910 master-0 kubenswrapper[4010]: I0319 09:18:51.226838 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:51.230586 master-0 kubenswrapper[4010]: E0319 09:18:51.227986 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:52.226278 master-0 kubenswrapper[4010]: I0319 09:18:52.226202 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:52.226520 master-0 kubenswrapper[4010]: E0319 09:18:52.226357 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:53.226560 master-0 kubenswrapper[4010]: I0319 09:18:53.226460 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:53.227056 master-0 kubenswrapper[4010]: E0319 09:18:53.226626 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:54.226503 master-0 kubenswrapper[4010]: I0319 09:18:54.226429 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:54.226748 master-0 kubenswrapper[4010]: E0319 09:18:54.226568 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:54.939460 master-0 kubenswrapper[4010]: I0319 09:18:54.939401 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-22clf"
Mar 19 09:18:54.940027 master-0 kubenswrapper[4010]: E0319 09:18:54.939838 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:18:54.940027 master-0 kubenswrapper[4010]: E0319 09:18:54.939908 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 19 09:18:54.940370 master-0 kubenswrapper[4010]: E0319 09:18:54.940323 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:18:54.940733 master-0 kubenswrapper[4010]: E0319 09:18:54.940578 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 19 09:18:54.940733 master-0 kubenswrapper[4010]: E0319 09:18:54.940690 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:18:54.940733 master-0 kubenswrapper[4010]: E0319 09:18:54.940724 4010 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb"
Mar 19 09:18:54.940962 master-0 kubenswrapper[4010]: E0319 09:18:54.940924 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 19 09:18:54.940962 master-0 kubenswrapper[4010]: E0319 09:18:54.940953 4010 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb"
Mar 19 09:18:55.226951 master-0 kubenswrapper[4010]: I0319 09:18:55.226822 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:55.227386 master-0 kubenswrapper[4010]: E0319 09:18:55.227025 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:56.226352 master-0 kubenswrapper[4010]: I0319 09:18:56.226268 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:56.226565 master-0 kubenswrapper[4010]: E0319 09:18:56.226438 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:57.227307 master-0 kubenswrapper[4010]: I0319 09:18:57.227201 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:57.227802 master-0 kubenswrapper[4010]: E0319 09:18:57.227404 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:18:58.227077 master-0 kubenswrapper[4010]: I0319 09:18:58.226980 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:18:58.228661 master-0 kubenswrapper[4010]: E0319 09:18:58.227158 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:18:58.283618 master-0 kubenswrapper[4010]: I0319 09:18:58.283530 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 19 09:18:59.226938 master-0 kubenswrapper[4010]: I0319 09:18:59.226847 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:18:59.227200 master-0 kubenswrapper[4010]: E0319 09:18:59.227146 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:19:00.227213 master-0 kubenswrapper[4010]: I0319 09:19:00.226872 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:00.227975 master-0 kubenswrapper[4010]: E0319 09:19:00.227266 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:19:01.031768 master-0 kubenswrapper[4010]: I0319 09:19:01.031700 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svct_872e5f8c-b014-4283-a4d2-0e2cfd29e192/kube-multus/0.log"
Mar 19 09:19:01.032164 master-0 kubenswrapper[4010]: I0319 09:19:01.032135 4010 generic.go:334] "Generic (PLEG): container finished" podID="872e5f8c-b014-4283-a4d2-0e2cfd29e192" containerID="b504737085975340ca235cec0c4c9e74b2eb5d8b9a50455476ac176eb78b4a5c" exitCode=1
Mar 19 09:19:01.032260 master-0 kubenswrapper[4010]: I0319 09:19:01.032202 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svct" event={"ID":"872e5f8c-b014-4283-a4d2-0e2cfd29e192","Type":"ContainerDied","Data":"b504737085975340ca235cec0c4c9e74b2eb5d8b9a50455476ac176eb78b4a5c"}
Mar 19 09:19:01.032975 master-0 kubenswrapper[4010]: I0319 09:19:01.032943 4010 scope.go:117] "RemoveContainer" containerID="b504737085975340ca235cec0c4c9e74b2eb5d8b9a50455476ac176eb78b4a5c"
Mar 19 09:19:01.094453 master-0 kubenswrapper[4010]: E0319 09:19:01.094383 4010 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 19 09:19:01.226947 master-0 kubenswrapper[4010]: I0319 09:19:01.226909 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:01.227745 master-0 kubenswrapper[4010]: E0319 09:19:01.227701 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:19:01.506622 master-0 kubenswrapper[4010]: E0319 09:19:01.506534 4010 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 09:19:02.040150 master-0 kubenswrapper[4010]: I0319 09:19:02.040090 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svct_872e5f8c-b014-4283-a4d2-0e2cfd29e192/kube-multus/0.log"
Mar 19 09:19:02.040494 master-0 kubenswrapper[4010]: I0319 09:19:02.040154 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svct" event={"ID":"872e5f8c-b014-4283-a4d2-0e2cfd29e192","Type":"ContainerStarted","Data":"0610380b3bb1f49357aaabe8c76260ca62aadb4cce00842ab471eef776e2bb79"}
Mar 19 09:19:02.058013 master-0 kubenswrapper[4010]: I0319 09:19:02.057893 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=4.057866745 podStartE2EDuration="4.057866745s" podCreationTimestamp="2026-03-19 09:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:19:01.074620213 +0000 UTC m=+120.600564840" watchObservedRunningTime="2026-03-19 09:19:02.057866745 +0000 UTC m=+121.583811352"
Mar 19 09:19:02.226596 master-0 kubenswrapper[4010]: I0319 09:19:02.226490 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:02.226961 master-0 kubenswrapper[4010]: E0319 09:19:02.226650 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:19:03.227247 master-0 kubenswrapper[4010]: I0319 09:19:03.227174 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:03.228304 master-0 kubenswrapper[4010]: E0319 09:19:03.227345 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:19:04.226775 master-0 kubenswrapper[4010]: I0319 09:19:04.226675 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:04.228241 master-0 kubenswrapper[4010]: E0319 09:19:04.227697 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def"
Mar 19 09:19:04.940208 master-0 kubenswrapper[4010]: E0319 09:19:04.940062 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 19 09:19:04.941277 master-0 kubenswrapper[4010]: E0319 09:19:04.940220 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:19:04.941277 master-0 kubenswrapper[4010]: E0319 09:19:04.940633 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:19:04.941277 master-0 kubenswrapper[4010]: E0319 09:19:04.941136 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 19 09:19:04.941763 master-0 kubenswrapper[4010]: E0319 09:19:04.941650 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 19 09:19:04.941763 master-0 kubenswrapper[4010]: E0319 09:19:04.941672 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"]
Mar 19 09:19:04.941763 master-0 kubenswrapper[4010]: E0319 09:19:04.941704 4010 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb"
Mar 19 09:19:04.941895 master-0 kubenswrapper[4010]: E0319 09:19:04.941752 4010 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb"
Mar 19 09:19:05.227126 master-0 kubenswrapper[4010]: I0319 09:19:05.226851 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:05.227425 master-0 kubenswrapper[4010]: E0319 09:19:05.227143 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577"
Mar 19 09:19:06.226323 master-0 kubenswrapper[4010]: I0319 09:19:06.226212 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:06.227418 master-0 kubenswrapper[4010]: E0319 09:19:06.226373 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:06.507696 master-0 kubenswrapper[4010]: E0319 09:19:06.507563 4010 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:06.942131 master-0 kubenswrapper[4010]: I0319 09:19:06.942020 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:06.942401 master-0 kubenswrapper[4010]: E0319 09:19:06.942222 4010 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:19:06.942401 master-0 kubenswrapper[4010]: E0319 09:19:06.942357 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:20:10.942302593 +0000 UTC m=+190.468247200 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:19:07.226971 master-0 kubenswrapper[4010]: I0319 09:19:07.226783 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:07.227523 master-0 kubenswrapper[4010]: E0319 09:19:07.226994 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:08.226709 master-0 kubenswrapper[4010]: I0319 09:19:08.226638 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:08.226934 master-0 kubenswrapper[4010]: E0319 09:19:08.226832 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:09.226924 master-0 kubenswrapper[4010]: I0319 09:19:09.226831 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:09.227630 master-0 kubenswrapper[4010]: E0319 09:19:09.226982 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:10.226612 master-0 kubenswrapper[4010]: I0319 09:19:10.226533 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:10.226829 master-0 kubenswrapper[4010]: E0319 09:19:10.226711 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:11.227127 master-0 kubenswrapper[4010]: I0319 09:19:11.226966 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:11.233459 master-0 kubenswrapper[4010]: E0319 09:19:11.231811 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:11.508418 master-0 kubenswrapper[4010]: E0319 09:19:11.508253 4010 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:12.226239 master-0 kubenswrapper[4010]: I0319 09:19:12.226170 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:12.226494 master-0 kubenswrapper[4010]: E0319 09:19:12.226308 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:13.226488 master-0 kubenswrapper[4010]: I0319 09:19:13.226437 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:13.226956 master-0 kubenswrapper[4010]: E0319 09:19:13.226578 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:14.226494 master-0 kubenswrapper[4010]: I0319 09:19:14.226417 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:14.227007 master-0 kubenswrapper[4010]: E0319 09:19:14.226564 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:14.940215 master-0 kubenswrapper[4010]: E0319 09:19:14.940150 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 19 09:19:14.940608 master-0 kubenswrapper[4010]: E0319 09:19:14.940573 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 19 09:19:14.941082 master-0 kubenswrapper[4010]: E0319 09:19:14.941054 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 19 09:19:14.941271 master-0 kubenswrapper[4010]: E0319 09:19:14.941244 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 19 09:19:14.941424 master-0 kubenswrapper[4010]: E0319 09:19:14.941300 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 19 09:19:14.941649 master-0 kubenswrapper[4010]: E0319 09:19:14.941602 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 19 09:19:14.941896 master-0 kubenswrapper[4010]: E0319 09:19:14.941656 4010 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb" Mar 19 09:19:14.942519 master-0 kubenswrapper[4010]: E0319 09:19:14.942493 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 19 09:19:14.942625 master-0 kubenswrapper[4010]: E0319 09:19:14.942603 4010 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb" Mar 19 09:19:14.942931 master-0 kubenswrapper[4010]: E0319 09:19:14.942896 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 19 09:19:14.944165 master-0 kubenswrapper[4010]: E0319 09:19:14.944143 4010 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Mar 19 09:19:14.944262 master-0 kubenswrapper[4010]: E0319 09:19:14.944244 4010 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovnkube-controller" Mar 19 09:19:15.227270 master-0 kubenswrapper[4010]: I0319 09:19:15.227130 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:15.227270 master-0 kubenswrapper[4010]: E0319 09:19:15.227254 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:16.226979 master-0 kubenswrapper[4010]: I0319 09:19:16.226886 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:16.227286 master-0 kubenswrapper[4010]: E0319 09:19:16.227044 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:16.510056 master-0 kubenswrapper[4010]: E0319 09:19:16.509880 4010 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:19:17.089777 master-0 kubenswrapper[4010]: I0319 09:19:17.089649 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovnkube-controller/0.log" Mar 19 09:19:17.091669 master-0 kubenswrapper[4010]: I0319 09:19:17.091637 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-ovn-metrics/0.log" Mar 19 09:19:17.092082 master-0 kubenswrapper[4010]: I0319 09:19:17.092057 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-node/0.log" Mar 19 09:19:17.092565 master-0 kubenswrapper[4010]: I0319 09:19:17.092531 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-acl-logging/0.log" Mar 19 09:19:17.093040 master-0 kubenswrapper[4010]: I0319 09:19:17.093023 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-controller/0.log" Mar 19 09:19:17.093451 master-0 kubenswrapper[4010]: I0319 09:19:17.093426 4010 generic.go:334] "Generic (PLEG): container finished" podID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerID="6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc" exitCode=137 Mar 19 09:19:17.093566 master-0 kubenswrapper[4010]: I0319 09:19:17.093549 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc"} Mar 19 09:19:17.226770 master-0 kubenswrapper[4010]: I0319 09:19:17.226724 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:17.227098 master-0 kubenswrapper[4010]: E0319 09:19:17.227069 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:17.302880 master-0 kubenswrapper[4010]: I0319 09:19:17.302860 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovnkube-controller/0.log" Mar 19 09:19:17.304596 master-0 kubenswrapper[4010]: I0319 09:19:17.304571 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-ovn-metrics/0.log" Mar 19 09:19:17.305147 master-0 kubenswrapper[4010]: I0319 09:19:17.305134 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-node/0.log" Mar 19 09:19:17.305798 master-0 kubenswrapper[4010]: I0319 09:19:17.305762 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-acl-logging/0.log" Mar 19 09:19:17.306356 master-0 kubenswrapper[4010]: I0319 09:19:17.306342 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-controller/0.log" Mar 19 09:19:17.307020 master-0 kubenswrapper[4010]: I0319 09:19:17.307005 4010 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:19:17.330785 master-0 kubenswrapper[4010]: I0319 09:19:17.330750 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-etc-openvswitch\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.330966 master-0 kubenswrapper[4010]: I0319 09:19:17.330951 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-ovn\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.331088 master-0 kubenswrapper[4010]: I0319 09:19:17.331075 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqfx\" (UniqueName: \"kubernetes.io/projected/57ac3af0-7d16-4715-9afa-6e98a2777e6e-kube-api-access-gfqfx\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.331179 master-0 kubenswrapper[4010]: I0319 09:19:17.331167 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-var-lib-openvswitch\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.331262 master-0 kubenswrapper[4010]: I0319 09:19:17.331250 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-netd\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.331433 master-0 kubenswrapper[4010]: I0319 09:19:17.331422 4010 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-systemd-units\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.331545 master-0 kubenswrapper[4010]: I0319 09:19:17.330889 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:17.331628 master-0 kubenswrapper[4010]: I0319 09:19:17.331024 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:17.331628 master-0 kubenswrapper[4010]: I0319 09:19:17.331351 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:17.331628 master-0 kubenswrapper[4010]: I0319 09:19:17.331367 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:17.331719 master-0 kubenswrapper[4010]: I0319 09:19:17.331698 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-log-socket" (OuterVolumeSpecName: "log-socket") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:17.331785 master-0 kubenswrapper[4010]: I0319 09:19:17.331767 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:19:17.331876 master-0 kubenswrapper[4010]: I0319 09:19:17.331864 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-log-socket\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.331969 master-0 kubenswrapper[4010]: I0319 09:19:17.331958 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-slash\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.332639 master-0 kubenswrapper[4010]: I0319 09:19:17.332155 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-config\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.332748 master-0 kubenswrapper[4010]: I0319 09:19:17.332736 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-bin\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") " Mar 19 09:19:17.332897 master-0 kubenswrapper[4010]: I0319 09:19:17.332101 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-slash" (OuterVolumeSpecName: "host-slash") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.332977 master-0 kubenswrapper[4010]: I0319 09:19:17.332609 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:19:17.333050 master-0 kubenswrapper[4010]: I0319 09:19:17.332832 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.333144 master-0 kubenswrapper[4010]: I0319 09:19:17.333132 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovn-node-metrics-cert\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.333556 master-0 kubenswrapper[4010]: I0319 09:19:17.333542 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-node-log\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.333651 master-0 kubenswrapper[4010]: I0319 09:19:17.333639 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-kubelet\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.333781 master-0 kubenswrapper[4010]: I0319 09:19:17.333771 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-openvswitch\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.333871 master-0 kubenswrapper[4010]: I0319 09:19:17.333858 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.334007 master-0 kubenswrapper[4010]: I0319 09:19:17.333767 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-node-log" (OuterVolumeSpecName: "node-log") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.334063 master-0 kubenswrapper[4010]: I0319 09:19:17.333771 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.334063 master-0 kubenswrapper[4010]: I0319 09:19:17.333830 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.334063 master-0 kubenswrapper[4010]: I0319 09:19:17.333950 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.334063 master-0 kubenswrapper[4010]: I0319 09:19:17.333983 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-env-overrides\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.334190 master-0 kubenswrapper[4010]: I0319 09:19:17.334079 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-ovn-kubernetes\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.334190 master-0 kubenswrapper[4010]: I0319 09:19:17.334100 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-netns\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.334190 master-0 kubenswrapper[4010]: I0319 09:19:17.334161 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.334368 master-0 kubenswrapper[4010]: I0319 09:19:17.334192 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-systemd\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.334368 master-0 kubenswrapper[4010]: I0319 09:19:17.334226 4010 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-script-lib\") pod \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\" (UID: \"57ac3af0-7d16-4715-9afa-6e98a2777e6e\") "
Mar 19 09:19:17.334368 master-0 kubenswrapper[4010]: I0319 09:19:17.334190 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.334760 master-0 kubenswrapper[4010]: I0319 09:19:17.334734 4010 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-node-log\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334825 master-0 kubenswrapper[4010]: I0319 09:19:17.334781 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:19:17.334825 master-0 kubenswrapper[4010]: I0319 09:19:17.334807 4010 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334825 master-0 kubenswrapper[4010]: I0319 09:19:17.334824 4010 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334835 4010 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334855 4010 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334864 4010 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-run-netns\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334872 4010 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334879 4010 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334887 4010 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334896 4010 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334904 4010 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334913 4010 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-log-socket\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334921 4010 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.334928 master-0 kubenswrapper[4010]: I0319 09:19:17.334931 4010 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.335187 master-0 kubenswrapper[4010]: I0319 09:19:17.334940 4010 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.335484 master-0 kubenswrapper[4010]: I0319 09:19:17.335452 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:19:17.336269 master-0 kubenswrapper[4010]: I0319 09:19:17.336249 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:19:17.336359 master-0 kubenswrapper[4010]: I0319 09:19:17.336297 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ac3af0-7d16-4715-9afa-6e98a2777e6e-kube-api-access-gfqfx" (OuterVolumeSpecName: "kube-api-access-gfqfx") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "kube-api-access-gfqfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:19:17.338964 master-0 kubenswrapper[4010]: I0319 09:19:17.338927 4010 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "57ac3af0-7d16-4715-9afa-6e98a2777e6e" (UID: "57ac3af0-7d16-4715-9afa-6e98a2777e6e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:19:17.436148 master-0 kubenswrapper[4010]: I0319 09:19:17.436107 4010 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ac3af0-7d16-4715-9afa-6e98a2777e6e-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.436398 master-0 kubenswrapper[4010]: I0319 09:19:17.436379 4010 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.436569 master-0 kubenswrapper[4010]: I0319 09:19:17.436556 4010 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqfx\" (UniqueName: \"kubernetes.io/projected/57ac3af0-7d16-4715-9afa-6e98a2777e6e-kube-api-access-gfqfx\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.436643 master-0 kubenswrapper[4010]: I0319 09:19:17.436633 4010 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ac3af0-7d16-4715-9afa-6e98a2777e6e-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.436706 master-0 kubenswrapper[4010]: I0319 09:19:17.436697 4010 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ac3af0-7d16-4715-9afa-6e98a2777e6e-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 19 09:19:17.531765 master-0 kubenswrapper[4010]: I0319 09:19:17.531712 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fwjzr"]
Mar 19 09:19:17.532361 master-0 kubenswrapper[4010]: E0319 09:19:17.532344 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kubecfg-setup"
Mar 19 09:19:17.532437 master-0 kubenswrapper[4010]: I0319 09:19:17.532427 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kubecfg-setup"
Mar 19 09:19:17.532509 master-0 kubenswrapper[4010]: E0319 09:19:17.532499 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb"
Mar 19 09:19:17.532576 master-0 kubenswrapper[4010]: I0319 09:19:17.532567 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb"
Mar 19 09:19:17.532629 master-0 kubenswrapper[4010]: E0319 09:19:17.532620 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-acl-logging"
Mar 19 09:19:17.532685 master-0 kubenswrapper[4010]: I0319 09:19:17.532676 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-acl-logging"
Mar 19 09:19:17.532741 master-0 kubenswrapper[4010]: E0319 09:19:17.532730 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovnkube-controller"
Mar 19 09:19:17.532791 master-0 kubenswrapper[4010]: I0319 09:19:17.532782 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovnkube-controller"
Mar 19 09:19:17.532868 master-0 kubenswrapper[4010]: E0319 09:19:17.532852 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-controller"
Mar 19 09:19:17.532944 master-0 kubenswrapper[4010]: I0319 09:19:17.532934 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-controller"
Mar 19 09:19:17.532997 master-0 kubenswrapper[4010]: E0319 09:19:17.532989 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb"
Mar 19 09:19:17.533042 master-0 kubenswrapper[4010]: I0319 09:19:17.533034 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb"
Mar 19 09:19:17.533088 master-0 kubenswrapper[4010]: E0319 09:19:17.533080 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-node"
Mar 19 09:19:17.533138 master-0 kubenswrapper[4010]: I0319 09:19:17.533129 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-node"
Mar 19 09:19:17.533189 master-0 kubenswrapper[4010]: E0319 09:19:17.533181 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="northd"
Mar 19 09:19:17.533241 master-0 kubenswrapper[4010]: I0319 09:19:17.533232 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="northd"
Mar 19 09:19:17.533299 master-0 kubenswrapper[4010]: E0319 09:19:17.533290 4010 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 19 09:19:17.533350 master-0 kubenswrapper[4010]: I0319 09:19:17.533342 4010 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 19 09:19:17.533431 master-0 kubenswrapper[4010]: I0319 09:19:17.533421 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-acl-logging"
Mar 19 09:19:17.533568 master-0 kubenswrapper[4010]: I0319 09:19:17.533554 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-node"
Mar 19 09:19:17.533645 master-0 kubenswrapper[4010]: I0319 09:19:17.533630 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="northd"
Mar 19 09:19:17.533728 master-0 kubenswrapper[4010]: I0319 09:19:17.533716 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovnkube-controller"
Mar 19 09:19:17.533796 master-0 kubenswrapper[4010]: I0319 09:19:17.533779 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="nbdb"
Mar 19 09:19:17.533874 master-0 kubenswrapper[4010]: I0319 09:19:17.533864 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="ovn-controller"
Mar 19 09:19:17.533937 master-0 kubenswrapper[4010]: I0319 09:19:17.533928 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 19 09:19:17.533988 master-0 kubenswrapper[4010]: I0319 09:19:17.533980 4010 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" containerName="sbdb"
Mar 19 09:19:17.534644 master-0 kubenswrapper[4010]: I0319 09:19:17.534628 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638629 master-0 kubenswrapper[4010]: I0319 09:19:17.638501 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638629 master-0 kubenswrapper[4010]: I0319 09:19:17.638560 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638629 master-0 kubenswrapper[4010]: I0319 09:19:17.638586 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638629 master-0 kubenswrapper[4010]: I0319 09:19:17.638612 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638947 master-0 kubenswrapper[4010]: I0319 09:19:17.638656 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638947 master-0 kubenswrapper[4010]: I0319 09:19:17.638893 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.638947 master-0 kubenswrapper[4010]: I0319 09:19:17.638924 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639060 master-0 kubenswrapper[4010]: I0319 09:19:17.638950 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639060 master-0 kubenswrapper[4010]: I0319 09:19:17.638972 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639060 master-0 kubenswrapper[4010]: I0319 09:19:17.638992 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639060 master-0 kubenswrapper[4010]: I0319 09:19:17.639013 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639060 master-0 kubenswrapper[4010]: I0319 09:19:17.639043 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639064 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639083 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639133 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639155 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639175 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639196 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639223 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.639244 master-0 kubenswrapper[4010]: I0319 09:19:17.639243 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.740789 master-0 kubenswrapper[4010]: I0319 09:19:17.740176 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.740986 master-0 kubenswrapper[4010]: I0319 09:19:17.740346 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.740986 master-0 kubenswrapper[4010]: I0319 09:19:17.740912 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741114 master-0 kubenswrapper[4010]: I0319 09:19:17.740993 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741114 master-0 kubenswrapper[4010]: I0319 09:19:17.741019 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741114 master-0 kubenswrapper[4010]: I0319 09:19:17.741029 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741114 master-0 kubenswrapper[4010]: I0319 09:19:17.741054 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741114 master-0 kubenswrapper[4010]: I0319 09:19:17.741072 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741114 master-0 kubenswrapper[4010]: I0319 09:19:17.741096 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741297 master-0 kubenswrapper[4010]: I0319 09:19:17.741141 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741297 master-0 kubenswrapper[4010]: I0319 09:19:17.741167 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741297 master-0 kubenswrapper[4010]: I0319 09:19:17.741184 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741390 master-0 kubenswrapper[4010]: I0319 09:19:17.741324 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741491 master-0 kubenswrapper[4010]: I0319 09:19:17.741442 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741524 master-0 kubenswrapper[4010]: I0319 09:19:17.741501 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741559 master-0 kubenswrapper[4010]: I0319 09:19:17.741530 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741597 master-0 kubenswrapper[4010]: I0319 09:19:17.741555 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741597 master-0 kubenswrapper[4010]: I0319 09:19:17.741581 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741661 master-0 kubenswrapper[4010]: I0319 09:19:17.741603 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741661 master-0 kubenswrapper[4010]: I0319 09:19:17.741642 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741720 master-0 kubenswrapper[4010]: I0319 09:19:17.741655 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741720 master-0 kubenswrapper[4010]: I0319 09:19:17.741684 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741720 master-0 kubenswrapper[4010]: I0319 09:19:17.741709 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741818 master-0 kubenswrapper[4010]: I0319 09:19:17.741761 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741905 master-0 kubenswrapper[4010]: I0319 09:19:17.741870 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.741905 master-0 kubenswrapper[4010]: I0319 09:19:17.741893 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.742048 master-0 kubenswrapper[4010]: I0319 09:19:17.741960 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:17.742090 master-0 kubenswrapper[4010]: I0319 09:19:17.742058 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID:
\"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742090 master-0 kubenswrapper[4010]: I0319 09:19:17.742069 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742151 master-0 kubenswrapper[4010]: I0319 09:19:17.742096 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742151 master-0 kubenswrapper[4010]: I0319 09:19:17.742098 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742151 master-0 kubenswrapper[4010]: I0319 09:19:17.742120 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742151 master-0 kubenswrapper[4010]: I0319 09:19:17.742131 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742151 master-0 kubenswrapper[4010]: I0319 09:19:17.742143 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742295 master-0 kubenswrapper[4010]: I0319 09:19:17.742162 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742295 master-0 kubenswrapper[4010]: I0319 09:19:17.742202 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742295 master-0 kubenswrapper[4010]: I0319 09:19:17.742222 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.742566 master-0 kubenswrapper[4010]: I0319 09:19:17.742530 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.746890 
master-0 kubenswrapper[4010]: I0319 09:19:17.746849 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.762640 master-0 kubenswrapper[4010]: I0319 09:19:17.762571 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.847578 master-0 kubenswrapper[4010]: I0319 09:19:17.847436 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:17.858436 master-0 kubenswrapper[4010]: W0319 09:19:17.858397 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96902651_8e2b_44c2_be80_0a8c7c28cb58.slice/crio-0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56 WatchSource:0}: Error finding container 0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56: Status 404 returned error can't find the container with id 0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56 Mar 19 09:19:18.100665 master-0 kubenswrapper[4010]: I0319 09:19:18.100119 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovnkube-controller/0.log" Mar 19 09:19:18.103035 master-0 kubenswrapper[4010]: I0319 09:19:18.102959 4010 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-ovn-metrics/0.log" Mar 19 09:19:18.103418 master-0 kubenswrapper[4010]: I0319 09:19:18.103347 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/kube-rbac-proxy-node/0.log" Mar 19 09:19:18.104212 master-0 kubenswrapper[4010]: I0319 09:19:18.104177 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-acl-logging/0.log" Mar 19 09:19:18.104644 master-0 kubenswrapper[4010]: I0319 09:19:18.104626 4010 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-22clf_57ac3af0-7d16-4715-9afa-6e98a2777e6e/ovn-controller/0.log" Mar 19 09:19:18.105644 master-0 kubenswrapper[4010]: I0319 09:19:18.105547 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" event={"ID":"57ac3af0-7d16-4715-9afa-6e98a2777e6e","Type":"ContainerDied","Data":"c283b976ccff2f081d129aad2281421561a14a7be4a6f3749d2de0cb2ccb0b0b"} Mar 19 09:19:18.105741 master-0 kubenswrapper[4010]: I0319 09:19:18.105593 4010 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-22clf" Mar 19 09:19:18.105779 master-0 kubenswrapper[4010]: I0319 09:19:18.105676 4010 scope.go:117] "RemoveContainer" containerID="6ea06de7738f83b57506ea579d777a2ca15e923b2e7199e3db11647217382dcc" Mar 19 09:19:18.107574 master-0 kubenswrapper[4010]: I0319 09:19:18.107030 4010 generic.go:334] "Generic (PLEG): container finished" podID="96902651-8e2b-44c2-be80-0a8c7c28cb58" containerID="df60facd7b253794e244b5462531d7a854ab92c89e6e7a5b56683d4b99824cfc" exitCode=0 Mar 19 09:19:18.107574 master-0 kubenswrapper[4010]: I0319 09:19:18.107093 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerDied","Data":"df60facd7b253794e244b5462531d7a854ab92c89e6e7a5b56683d4b99824cfc"} Mar 19 09:19:18.107574 master-0 kubenswrapper[4010]: I0319 09:19:18.107143 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56"} Mar 19 09:19:18.124781 master-0 kubenswrapper[4010]: I0319 09:19:18.124708 4010 scope.go:117] "RemoveContainer" containerID="37bcda8d9f7bab92a4afbfc6cd596c04864e2ab2719b4011e61ad73b5a2801a1" Mar 19 09:19:18.137575 master-0 kubenswrapper[4010]: I0319 09:19:18.137536 4010 scope.go:117] "RemoveContainer" containerID="eec1da6d81bddf16e16ed94f28148464aa9f4fb02490c8510d9968c8ccc4f75e" Mar 19 09:19:18.150690 master-0 kubenswrapper[4010]: I0319 09:19:18.150650 4010 scope.go:117] "RemoveContainer" containerID="4f4c4c7bacb4c82526cc0b717400eed5575e7255b8fc41ce95e4db61be21ac21" Mar 19 09:19:18.166217 master-0 kubenswrapper[4010]: I0319 09:19:18.166172 4010 scope.go:117] "RemoveContainer" containerID="f7e4a347ce2589cef46085f03e6a3c4fbde90d30b9b65bf4f08fba449a466100" Mar 19 09:19:18.195125 master-0 
kubenswrapper[4010]: I0319 09:19:18.194806 4010 scope.go:117] "RemoveContainer" containerID="201c6766340c20cd2107ef0e5bda47bb093b5e2cb7e924c58384097520fb652a" Mar 19 09:19:18.205130 master-0 kubenswrapper[4010]: I0319 09:19:18.205089 4010 scope.go:117] "RemoveContainer" containerID="1980c8d39693f82bcc98b960571af6d122ab09b758b31f3eb0468f4d6840dac2" Mar 19 09:19:18.216542 master-0 kubenswrapper[4010]: I0319 09:19:18.216498 4010 scope.go:117] "RemoveContainer" containerID="0486d5600b5ffa9af48869cd91a73a55fed8856efb519934adb831f0de1d5b12" Mar 19 09:19:18.226682 master-0 kubenswrapper[4010]: I0319 09:19:18.226607 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:18.226860 master-0 kubenswrapper[4010]: E0319 09:19:18.226804 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:18.227829 master-0 kubenswrapper[4010]: I0319 09:19:18.227799 4010 scope.go:117] "RemoveContainer" containerID="f3f14910b909c8727132f1ac9221bf9b5690b3430ea17e089e4840574b473f78" Mar 19 09:19:18.256839 master-0 kubenswrapper[4010]: I0319 09:19:18.256785 4010 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-22clf"] Mar 19 09:19:18.265128 master-0 kubenswrapper[4010]: I0319 09:19:18.265097 4010 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-22clf"] Mar 19 09:19:19.116985 master-0 kubenswrapper[4010]: I0319 09:19:19.116915 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"5e77e522c97f4f2d94d386381eb8e82a42def5887b72b63eb82df6b88d4da4bf"} Mar 19 09:19:19.116985 master-0 kubenswrapper[4010]: I0319 09:19:19.116960 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"9f552ae6317d0985acf653deb851508e5a40ca64804b0e7942e2ae0a60da3060"} Mar 19 09:19:19.116985 master-0 kubenswrapper[4010]: I0319 09:19:19.116971 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"d29e0194aae6a332dd4d6548509de8a6916b98cb86cb49b2b25d6c9da2a044b3"} Mar 19 09:19:19.116985 master-0 kubenswrapper[4010]: I0319 09:19:19.116983 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"125318c256540247f103fbfa7fb8f9a02d81aca742a33eab7d7a463d88cfa573"} Mar 19 
09:19:19.116985 master-0 kubenswrapper[4010]: I0319 09:19:19.116993 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"641fc4ac17f6338e8067e3b7d13d8176a8b3af34d72dae917f18490fabcf71cc"} Mar 19 09:19:19.117884 master-0 kubenswrapper[4010]: I0319 09:19:19.117004 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"2a1a8b3d889a1ddb57d9d8f73d56b5bbeadbc7da708caf073e333a61f788d808"} Mar 19 09:19:19.227059 master-0 kubenswrapper[4010]: I0319 09:19:19.226966 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:19.227354 master-0 kubenswrapper[4010]: E0319 09:19:19.227110 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:19.230987 master-0 kubenswrapper[4010]: I0319 09:19:19.230940 4010 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ac3af0-7d16-4715-9afa-6e98a2777e6e" path="/var/lib/kubelet/pods/57ac3af0-7d16-4715-9afa-6e98a2777e6e/volumes" Mar 19 09:19:20.226719 master-0 kubenswrapper[4010]: I0319 09:19:20.226530 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:20.227187 master-0 kubenswrapper[4010]: E0319 09:19:20.226749 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:21.127850 master-0 kubenswrapper[4010]: I0319 09:19:21.127533 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"90bb07ebce987cbcb97bbea34873d247ffe7efb491cd619af8471e705513e6f7"} Mar 19 09:19:21.226399 master-0 kubenswrapper[4010]: I0319 09:19:21.226337 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:21.227305 master-0 kubenswrapper[4010]: E0319 09:19:21.227266 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:21.510491 master-0 kubenswrapper[4010]: E0319 09:19:21.510356 4010 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:19:21.673803 master-0 kubenswrapper[4010]: I0319 09:19:21.673761 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:21.674136 master-0 kubenswrapper[4010]: E0319 09:19:21.674118 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:19:21.674245 master-0 kubenswrapper[4010]: E0319 09:19:21.674235 4010 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:19:21.674308 master-0 kubenswrapper[4010]: E0319 09:19:21.674298 4010 projected.go:194] Error preparing data for projected volume kube-api-access-wrs54 for pod openshift-network-diagnostics/network-check-target-95w9b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:21.674414 master-0 kubenswrapper[4010]: E0319 09:19:21.674402 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54 podName:307605e6-d1cf-4172-8e7d-918c435f3577 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:25.674387351 +0000 UTC m=+205.200331948 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wrs54" (UniqueName: "kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54") pod "network-check-target-95w9b" (UID: "307605e6-d1cf-4172-8e7d-918c435f3577") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:19:22.226718 master-0 kubenswrapper[4010]: I0319 09:19:22.226591 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:22.227041 master-0 kubenswrapper[4010]: E0319 09:19:22.226767 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:23.227367 master-0 kubenswrapper[4010]: I0319 09:19:23.227286 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:23.227989 master-0 kubenswrapper[4010]: E0319 09:19:23.227462 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:24.145754 master-0 kubenswrapper[4010]: I0319 09:19:24.145675 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" event={"ID":"96902651-8e2b-44c2-be80-0a8c7c28cb58","Type":"ContainerStarted","Data":"f82f9b92057673da988b4d159145e65c92c5a597e81adf7f835dbebd722df35d"} Mar 19 09:19:24.146857 master-0 kubenswrapper[4010]: I0319 09:19:24.146183 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:24.146857 master-0 kubenswrapper[4010]: I0319 09:19:24.146234 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:24.166709 master-0 kubenswrapper[4010]: I0319 09:19:24.166637 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:24.226599 master-0 kubenswrapper[4010]: I0319 09:19:24.226462 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:24.226987 master-0 kubenswrapper[4010]: E0319 09:19:24.226727 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:24.715503 master-0 kubenswrapper[4010]: I0319 09:19:24.715389 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" podStartSLOduration=7.715369809 podStartE2EDuration="7.715369809s" podCreationTimestamp="2026-03-19 09:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:19:24.683765037 +0000 UTC m=+144.209709664" watchObservedRunningTime="2026-03-19 09:19:24.715369809 +0000 UTC m=+144.241314416" Mar 19 09:19:25.148511 master-0 kubenswrapper[4010]: I0319 09:19:25.148300 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:25.179422 master-0 kubenswrapper[4010]: I0319 09:19:25.179338 4010 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:25.227592 master-0 kubenswrapper[4010]: I0319 09:19:25.227442 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:25.227827 master-0 kubenswrapper[4010]: E0319 09:19:25.227754 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:25.824715 master-0 kubenswrapper[4010]: I0319 09:19:25.823849 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-95w9b"] Mar 19 09:19:25.837051 master-0 kubenswrapper[4010]: I0319 09:19:25.832962 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p76jz"] Mar 19 09:19:25.837051 master-0 kubenswrapper[4010]: I0319 09:19:25.833239 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:25.837051 master-0 kubenswrapper[4010]: E0319 09:19:25.833377 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:26.151456 master-0 kubenswrapper[4010]: I0319 09:19:26.151319 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:26.151456 master-0 kubenswrapper[4010]: E0319 09:19:26.151442 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:26.512063 master-0 kubenswrapper[4010]: E0319 09:19:26.511985 4010 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:19:27.227303 master-0 kubenswrapper[4010]: I0319 09:19:27.226813 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:27.228125 master-0 kubenswrapper[4010]: E0319 09:19:27.227439 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:28.226838 master-0 kubenswrapper[4010]: I0319 09:19:28.226769 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:28.227048 master-0 kubenswrapper[4010]: E0319 09:19:28.226935 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:29.226750 master-0 kubenswrapper[4010]: I0319 09:19:29.226677 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:29.227357 master-0 kubenswrapper[4010]: E0319 09:19:29.226868 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:30.227324 master-0 kubenswrapper[4010]: I0319 09:19:30.227238 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:30.227913 master-0 kubenswrapper[4010]: E0319 09:19:30.227402 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-95w9b" podUID="307605e6-d1cf-4172-8e7d-918c435f3577" Mar 19 09:19:31.227295 master-0 kubenswrapper[4010]: I0319 09:19:31.226964 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:31.227891 master-0 kubenswrapper[4010]: E0319 09:19:31.227819 4010 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:19:32.226829 master-0 kubenswrapper[4010]: I0319 09:19:32.226740 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:32.229238 master-0 kubenswrapper[4010]: I0319 09:19:32.229193 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:19:32.229649 master-0 kubenswrapper[4010]: I0319 09:19:32.229314 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:19:33.226529 master-0 kubenswrapper[4010]: I0319 09:19:33.226449 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:33.228752 master-0 kubenswrapper[4010]: I0319 09:19:33.228713 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:19:35.924649 master-0 kubenswrapper[4010]: I0319 09:19:35.924606 4010 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 19 09:19:35.962906 master-0 kubenswrapper[4010]: I0319 09:19:35.962857 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"] Mar 19 09:19:35.963327 master-0 kubenswrapper[4010]: I0319 09:19:35.963291 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:35.966008 master-0 kubenswrapper[4010]: I0319 09:19:35.965961 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:19:35.966440 master-0 kubenswrapper[4010]: I0319 09:19:35.966403 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:19:35.966626 master-0 kubenswrapper[4010]: I0319 09:19:35.966588 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:19:35.966673 master-0 kubenswrapper[4010]: I0319 09:19:35.966588 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:19:35.966673 master-0 kubenswrapper[4010]: I0319 09:19:35.966664 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 09:19:35.966739 master-0 kubenswrapper[4010]: I0319 09:19:35.966658 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:19:35.966975 master-0 kubenswrapper[4010]: I0319 09:19:35.966947 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:19:35.969741 master-0 kubenswrapper[4010]: I0319 09:19:35.969711 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"] Mar 19 09:19:35.970221 master-0 kubenswrapper[4010]: I0319 09:19:35.970169 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:35.970736 master-0 kubenswrapper[4010]: I0319 09:19:35.970290 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"] Mar 19 09:19:35.970736 master-0 kubenswrapper[4010]: I0319 09:19:35.970522 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974501 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"] Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974693 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974739 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974787 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974815 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974922 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974942 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974968 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.975037 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"] Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.974988 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:19:35.976131 master-0 kubenswrapper[4010]: I0319 09:19:35.975372 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:35.977825 master-0 kubenswrapper[4010]: I0319 09:19:35.976624 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"] Mar 19 09:19:35.977825 master-0 kubenswrapper[4010]: I0319 09:19:35.976985 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"] Mar 19 09:19:35.977825 master-0 kubenswrapper[4010]: I0319 09:19:35.977068 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:35.977825 master-0 kubenswrapper[4010]: I0319 09:19:35.977420 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:35.978642 master-0 kubenswrapper[4010]: I0319 09:19:35.978608 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"] Mar 19 09:19:35.978972 master-0 kubenswrapper[4010]: I0319 09:19:35.978940 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:35.979324 master-0 kubenswrapper[4010]: I0319 09:19:35.979303 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:19:35.979417 master-0 kubenswrapper[4010]: I0319 09:19:35.979391 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.980123 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.980609 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.980851 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.981180 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.981727 4010 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"] Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.981801 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.981976 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.981988 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"] Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.982084 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.982132 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"] Mar 19 09:19:35.982521 master-0 kubenswrapper[4010]: I0319 09:19:35.982378 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:35.982833 master-0 kubenswrapper[4010]: I0319 09:19:35.982665 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:19:35.982833 master-0 kubenswrapper[4010]: I0319 09:19:35.982775 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:35.982989 master-0 kubenswrapper[4010]: I0319 09:19:35.982956 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:35.984109 master-0 kubenswrapper[4010]: I0319 09:19:35.984078 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"] Mar 19 09:19:35.984658 master-0 kubenswrapper[4010]: I0319 09:19:35.984632 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"] Mar 19 09:19:35.985088 master-0 kubenswrapper[4010]: I0319 09:19:35.985071 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:35.985186 master-0 kubenswrapper[4010]: I0319 09:19:35.984671 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.986062 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-6qck2"] Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.986719 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"] Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.987025 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.987130 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.987384 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.987445 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"] Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.987758 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.987870 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.988236 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.988241 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:19:35.988796 master-0 kubenswrapper[4010]: I0319 09:19:35.988556 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"] Mar 19 09:19:35.989112 master-0 kubenswrapper[4010]: I0319 09:19:35.988885 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:35.989706 master-0 kubenswrapper[4010]: I0319 09:19:35.989627 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"] Mar 19 09:19:35.990444 master-0 kubenswrapper[4010]: I0319 09:19:35.990413 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"] Mar 19 09:19:35.991015 master-0 kubenswrapper[4010]: I0319 09:19:35.990983 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:35.991160 master-0 kubenswrapper[4010]: I0319 09:19:35.991141 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:35.992149 master-0 kubenswrapper[4010]: I0319 09:19:35.992114 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:19:35.992383 master-0 kubenswrapper[4010]: I0319 09:19:35.992298 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 09:19:35.992383 master-0 kubenswrapper[4010]: I0319 09:19:35.992345 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:19:36.001978 master-0 kubenswrapper[4010]: I0319 09:19:36.001937 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:19:36.002412 master-0 kubenswrapper[4010]: I0319 09:19:36.002392 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 
09:19:36.002510 master-0 kubenswrapper[4010]: I0319 09:19:36.002456 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"] Mar 19 09:19:36.002597 master-0 kubenswrapper[4010]: I0319 09:19:36.002567 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:19:36.002887 master-0 kubenswrapper[4010]: I0319 09:19:36.002866 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:19:36.003319 master-0 kubenswrapper[4010]: I0319 09:19:36.003297 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.004596 master-0 kubenswrapper[4010]: I0319 09:19:36.004542 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.016760 master-0 kubenswrapper[4010]: I0319 09:19:36.016358 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:19:36.017686 master-0 kubenswrapper[4010]: I0319 09:19:36.017589 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"] Mar 19 09:19:36.023499 master-0 kubenswrapper[4010]: I0319 09:19:36.018484 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"] Mar 19 09:19:36.023499 master-0 kubenswrapper[4010]: I0319 09:19:36.018772 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.023499 master-0 kubenswrapper[4010]: I0319 09:19:36.020729 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:19:36.023499 master-0 kubenswrapper[4010]: I0319 09:19:36.020992 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:36.023805 master-0 kubenswrapper[4010]: I0319 09:19:36.021003 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.024361 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021092 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021125 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021204 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021235 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021560 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.025106 4010 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.025352 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021614 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021648 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.021681 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022189 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022229 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022266 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022295 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022325 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:19:36.027117 master-0 
kubenswrapper[4010]: I0319 09:19:36.022350 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022856 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022886 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.022908 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:19:36.027117 master-0 kubenswrapper[4010]: I0319 09:19:36.023712 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:19:36.032077 master-0 kubenswrapper[4010]: I0319 09:19:36.030538 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"] Mar 19 09:19:36.032077 master-0 kubenswrapper[4010]: I0319 09:19:36.030599 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"] Mar 19 09:19:36.032077 master-0 kubenswrapper[4010]: I0319 09:19:36.030864 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"] Mar 19 09:19:36.032077 master-0 kubenswrapper[4010]: I0319 09:19:36.031646 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"] Mar 19 09:19:36.032289 master-0 kubenswrapper[4010]: I0319 09:19:36.032204 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"] Mar 19 09:19:36.036091 master-0 kubenswrapper[4010]: I0319 09:19:36.035438 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"] Mar 19 09:19:36.036091 master-0 kubenswrapper[4010]: I0319 09:19:36.035512 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"] Mar 19 09:19:36.036620 master-0 kubenswrapper[4010]: I0319 09:19:36.036582 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 09:19:36.036620 master-0 kubenswrapper[4010]: I0319 09:19:36.036601 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 09:19:36.037086 master-0 kubenswrapper[4010]: I0319 09:19:36.037058 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:19:36.037163 master-0 kubenswrapper[4010]: I0319 09:19:36.037134 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037203 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037222 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037306 4010 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:19:36.037600 
master-0 kubenswrapper[4010]: I0319 09:19:36.037386 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037500 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037501 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037558 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:19:36.037600 master-0 kubenswrapper[4010]: I0319 09:19:36.037570 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.037857 master-0 kubenswrapper[4010]: I0319 09:19:36.037506 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.043443 master-0 kubenswrapper[4010]: I0319 09:19:36.040060 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:19:36.043682 master-0 kubenswrapper[4010]: I0319 09:19:36.043585 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 09:19:36.046181 master-0 kubenswrapper[4010]: I0319 09:19:36.046136 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"] Mar 19 09:19:36.046987 master-0 kubenswrapper[4010]: I0319 09:19:36.046925 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-6qck2"] Mar 19 09:19:36.047558 master-0 
kubenswrapper[4010]: I0319 09:19:36.047455 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:19:36.047692 master-0 kubenswrapper[4010]: I0319 09:19:36.047658 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"] Mar 19 09:19:36.049703 master-0 kubenswrapper[4010]: I0319 09:19:36.048847 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"] Mar 19 09:19:36.053876 master-0 kubenswrapper[4010]: I0319 09:19:36.053051 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"] Mar 19 09:19:36.058447 master-0 kubenswrapper[4010]: I0319 09:19:36.058363 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"] Mar 19 09:19:36.062706 master-0 kubenswrapper[4010]: I0319 09:19:36.062579 4010 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-2s58d"] Mar 19 09:19:36.063954 master-0 kubenswrapper[4010]: I0319 09:19:36.063910 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"] Mar 19 09:19:36.064107 master-0 kubenswrapper[4010]: I0319 09:19:36.064057 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.068345 master-0 kubenswrapper[4010]: I0319 09:19:36.068310 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"]
Mar 19 09:19:36.069202 master-0 kubenswrapper[4010]: I0319 09:19:36.069138 4010 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:19:36.073007 master-0 kubenswrapper[4010]: I0319 09:19:36.072964 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"]
Mar 19 09:19:36.075072 master-0 kubenswrapper[4010]: I0319 09:19:36.075010 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"]
Mar 19 09:19:36.077689 master-0 kubenswrapper[4010]: I0319 09:19:36.077638 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"]
Mar 19 09:19:36.077965 master-0 kubenswrapper[4010]: I0319 09:19:36.077721 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"]
Mar 19 09:19:36.077965 master-0 kubenswrapper[4010]: I0319 09:19:36.077740 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"]
Mar 19 09:19:36.078920 master-0 kubenswrapper[4010]: I0319 09:19:36.078890 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"]
Mar 19 09:19:36.079797 master-0 kubenswrapper[4010]: I0319 09:19:36.079765 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"]
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.083872 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lktk8\" (UniqueName: \"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.083917 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.083946 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.083994 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084053 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084107 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084142 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084184 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084211 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084238 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.084255 master-0 kubenswrapper[4010]: I0319 09:19:36.084262 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084296 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084333 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084375 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084402 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084426 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084448 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084494 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084529 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084563 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084590 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084614 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:36.084666 master-0 kubenswrapper[4010]: I0319 09:19:36.084641 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:36.084953 master-0 kubenswrapper[4010]: I0319 09:19:36.084686 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.084953 master-0 kubenswrapper[4010]: I0319 09:19:36.084749 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.085127 master-0 kubenswrapper[4010]: I0319 09:19:36.084875 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:19:36.085166 master-0 kubenswrapper[4010]: I0319 09:19:36.085140 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:19:36.085193 master-0 kubenswrapper[4010]: I0319 09:19:36.085167 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:36.085228 master-0 kubenswrapper[4010]: I0319 09:19:36.085190 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.085228 master-0 kubenswrapper[4010]: I0319 09:19:36.085214 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:36.085410 master-0 kubenswrapper[4010]: I0319 09:19:36.085238 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:36.085410 master-0 kubenswrapper[4010]: I0319 09:19:36.085367 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:36.085507 master-0 kubenswrapper[4010]: I0319 09:19:36.085447 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"
Mar 19 09:19:36.085507 master-0 kubenswrapper[4010]: I0319 09:19:36.085490 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:36.085583 master-0 kubenswrapper[4010]: I0319 09:19:36.085515 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5cd\" (UniqueName: \"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.085583 master-0 kubenswrapper[4010]: I0319 09:19:36.085540 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.085583 master-0 kubenswrapper[4010]: I0319 09:19:36.085565 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.085682 master-0 kubenswrapper[4010]: I0319 09:19:36.085593 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:36.085682 master-0 kubenswrapper[4010]: I0319 09:19:36.085624 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.085682 master-0 kubenswrapper[4010]: I0319 09:19:36.085646 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:36.085803 master-0 kubenswrapper[4010]: I0319 09:19:36.085669 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:36.085803 master-0 kubenswrapper[4010]: I0319 09:19:36.085714 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:36.085803 master-0 kubenswrapper[4010]: I0319 09:19:36.085745 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:19:36.085902 master-0 kubenswrapper[4010]: I0319 09:19:36.085829 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.085944 master-0 kubenswrapper[4010]: I0319 09:19:36.085918 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:36.086029 master-0 kubenswrapper[4010]: I0319 09:19:36.085951 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:36.086349 master-0 kubenswrapper[4010]: I0319 09:19:36.086276 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:19:36.086391 master-0 kubenswrapper[4010]: I0319 09:19:36.086369 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:36.086437 master-0 kubenswrapper[4010]: I0319 09:19:36.086411 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:36.086517 master-0 kubenswrapper[4010]: I0319 09:19:36.086480 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:19:36.086557 master-0 kubenswrapper[4010]: I0319 09:19:36.086527 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.086677 master-0 kubenswrapper[4010]: I0319 09:19:36.086608 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:36.086721 master-0 kubenswrapper[4010]: I0319 09:19:36.086694 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:36.086781 master-0 kubenswrapper[4010]: I0319 09:19:36.086762 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:36.086837 master-0 kubenswrapper[4010]: I0319 09:19:36.086797 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:36.086927 master-0 kubenswrapper[4010]: I0319 09:19:36.086868 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:36.087012 master-0 kubenswrapper[4010]: I0319 09:19:36.086988 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:36.087066 master-0 kubenswrapper[4010]: I0319 09:19:36.087024 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087075 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087201 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087259 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087285 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087338 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087372 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087573 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087603 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087655 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087751 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087805 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:36.088580 master-0 kubenswrapper[4010]: I0319 09:19:36.087832 4010 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:36.189107 master-0 kubenswrapper[4010]: I0319 09:19:36.189055 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:36.189560 master-0 kubenswrapper[4010]: I0319 09:19:36.189506 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:36.189704 master-0 kubenswrapper[4010]: I0319 09:19:36.189667 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:36.189911 master-0 kubenswrapper[4010]: I0319 09:19:36.189866 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.189969 master-0 kubenswrapper[4010]: I0319 09:19:36.189912 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.189969 master-0 kubenswrapper[4010]: I0319 09:19:36.189939 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.190073 master-0 kubenswrapper[4010]: I0319 09:19:36.189978 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:36.190073 master-0 kubenswrapper[4010]: I0319 09:19:36.190018 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:36.190073 master-0 kubenswrapper[4010]: I0319 09:19:36.190053 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.190212 master-0 kubenswrapper[4010]: I0319 09:19:36.190078 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:36.190212 master-0 kubenswrapper[4010]: I0319 09:19:36.190105 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:36.190212 master-0 kubenswrapper[4010]: I0319 09:19:36.190129 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:36.190212 master-0 kubenswrapper[4010]: I0319 09:19:36.190154 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:36.190212 master-0 kubenswrapper[4010]: I0319 09:19:36.190181 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:36.190212 master-0 kubenswrapper[4010]: I0319 09:19:36.190202 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:36.190417 master-0 kubenswrapper[4010]: I0319 09:19:36.190226 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:36.190417 master-0 kubenswrapper[4010]: I0319 09:19:36.190252 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:36.190417 master-0 kubenswrapper[4010]: I0319 09:19:36.190276 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:36.190417 master-0 kubenswrapper[4010]: I0319 09:19:36.190301 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:36.190417 master-0 kubenswrapper[4010]: I0319 09:19:36.190334 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 
09:19:36.191186 master-0 kubenswrapper[4010]: I0319 09:19:36.191119 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:36.191289 master-0 kubenswrapper[4010]: I0319 09:19:36.191245 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:36.191587 master-0 kubenswrapper[4010]: I0319 09:19:36.191538 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.191658 master-0 kubenswrapper[4010]: I0319 09:19:36.191597 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:36.191738 master-0 kubenswrapper[4010]: E0319 09:19:36.191603 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:36.191779 master-0 kubenswrapper[4010]: E0319 
09:19:36.191769 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.691753964 +0000 UTC m=+156.217698571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:36.191867 master-0 kubenswrapper[4010]: I0319 09:19:36.191841 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.191934 master-0 kubenswrapper[4010]: E0319 09:19:36.191711 4010 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:36.191934 master-0 kubenswrapper[4010]: E0319 09:19:36.191913 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.691904708 +0000 UTC m=+156.217849405 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:36.192075 master-0 kubenswrapper[4010]: E0319 09:19:36.192050 4010 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:36.192120 master-0 kubenswrapper[4010]: E0319 09:19:36.192088 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.692079892 +0000 UTC m=+156.218024619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:36.192166 master-0 kubenswrapper[4010]: I0319 09:19:36.192115 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.192214 master-0 kubenswrapper[4010]: I0319 09:19:36.192167 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod 
\"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:36.192262 master-0 kubenswrapper[4010]: I0319 09:19:36.192185 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:36.192301 master-0 kubenswrapper[4010]: I0319 09:19:36.192281 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.192340 master-0 kubenswrapper[4010]: I0319 09:19:36.192306 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:36.192340 master-0 kubenswrapper[4010]: I0319 09:19:36.192331 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" 
Mar 19 09:19:36.192414 master-0 kubenswrapper[4010]: I0319 09:19:36.192352 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:36.192414 master-0 kubenswrapper[4010]: I0319 09:19:36.192388 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.192534 master-0 kubenswrapper[4010]: I0319 09:19:36.192433 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" Mar 19 09:19:36.192534 master-0 kubenswrapper[4010]: I0319 09:19:36.192457 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:36.192534 master-0 kubenswrapper[4010]: I0319 09:19:36.192497 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:36.192534 master-0 kubenswrapper[4010]: I0319 09:19:36.192519 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:36.192534 master-0 kubenswrapper[4010]: I0319 09:19:36.192538 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5cd\" (UniqueName: \"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:36.192705 master-0 kubenswrapper[4010]: I0319 09:19:36.192557 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:36.192705 master-0 kubenswrapper[4010]: I0319 09:19:36.192578 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:36.192705 master-0 
kubenswrapper[4010]: I0319 09:19:36.192596 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:36.192705 master-0 kubenswrapper[4010]: I0319 09:19:36.192667 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.192705 master-0 kubenswrapper[4010]: I0319 09:19:36.192701 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192732 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192756 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192776 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192794 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192814 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192820 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192834 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192855 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192873 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.192891 master-0 kubenswrapper[4010]: I0319 09:19:36.192895 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.192919 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.192940 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.192961 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: E0319 09:19:36.192954 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.192985 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: E0319 09:19:36.193026 4010 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.693007076 +0000 UTC m=+156.218951773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.193075 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.193105 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.193139 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:36.193238 
master-0 kubenswrapper[4010]: I0319 09:19:36.193164 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.193188 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.193238 master-0 kubenswrapper[4010]: I0319 09:19:36.193209 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.193691 master-0 kubenswrapper[4010]: I0319 09:19:36.193300 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:36.193691 master-0 kubenswrapper[4010]: I0319 09:19:36.193338 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:36.193691 master-0 kubenswrapper[4010]: I0319 09:19:36.193360 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.193691 master-0 kubenswrapper[4010]: I0319 09:19:36.193387 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.193691 master-0 kubenswrapper[4010]: I0319 09:19:36.193411 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:36.194155 master-0 kubenswrapper[4010]: I0319 09:19:36.194112 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:36.194216 master-0 kubenswrapper[4010]: E0319 09:19:36.194204 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:36.194257 master-0 kubenswrapper[4010]: I0319 09:19:36.194230 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:36.194257 master-0 kubenswrapper[4010]: E0319 09:19:36.194237 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.694227298 +0000 UTC m=+156.220171995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:36.194336 master-0 kubenswrapper[4010]: E0319 09:19:36.194321 4010 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:36.194374 master-0 kubenswrapper[4010]: E0319 09:19:36.194353 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:36.694342721 +0000 UTC m=+156.220287338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:36.194415 master-0 kubenswrapper[4010]: E0319 09:19:36.194401 4010 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:36.194456 master-0 kubenswrapper[4010]: E0319 09:19:36.194426 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.694419163 +0000 UTC m=+156.220363770 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.194805 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.194871 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: E0319 09:19:36.194893 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: E0319 09:19:36.194939 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.694923455 +0000 UTC m=+156.220868062 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195105 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195350 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195368 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.192627 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 
09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195801 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195849 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195876 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195903 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195932 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lktk8\" (UniqueName: 
\"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195957 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:36.196145 master-0 kubenswrapper[4010]: I0319 09:19:36.195980 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:36.196859 master-0 kubenswrapper[4010]: I0319 09:19:36.196008 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:36.196859 master-0 kubenswrapper[4010]: I0319 09:19:36.196032 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:36.196859 master-0 kubenswrapper[4010]: I0319 09:19:36.196812 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.197103 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.197346 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.197664 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: E0319 09:19:36.197969 4010 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 
09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.197991 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: E0319 09:19:36.198030 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.698011384 +0000 UTC m=+156.223955991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: E0319 09:19:36.198065 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: E0319 09:19:36.198145 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.698115997 +0000 UTC m=+156.224060804 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: E0319 09:19:36.198232 4010 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: E0319 09:19:36.198296 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:36.698280671 +0000 UTC m=+156.224225478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.198559 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.198591 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:36.198612 master-0 kubenswrapper[4010]: I0319 09:19:36.198612 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.199781 master-0 kubenswrapper[4010]: I0319 09:19:36.199746 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.199844 master-0 kubenswrapper[4010]: I0319 09:19:36.199753 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.199844 master-0 kubenswrapper[4010]: I0319 09:19:36.199828 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.199941 master-0 kubenswrapper[4010]: I0319 09:19:36.199906 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:36.199983 master-0 kubenswrapper[4010]: I0319 09:19:36.199968 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:36.200610 master-0 kubenswrapper[4010]: I0319 09:19:36.200442 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:36.200990 master-0 kubenswrapper[4010]: I0319 09:19:36.200956 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:36.201902 master-0 kubenswrapper[4010]: I0319 09:19:36.201744 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") 
" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:36.202192 master-0 kubenswrapper[4010]: I0319 09:19:36.201930 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:36.203261 master-0 kubenswrapper[4010]: I0319 09:19:36.203214 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.218023 master-0 kubenswrapper[4010]: I0319 09:19:36.213154 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.220115 master-0 kubenswrapper[4010]: I0319 09:19:36.220069 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:36.234641 master-0 kubenswrapper[4010]: I0319 09:19:36.233980 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:36.239530 master-0 kubenswrapper[4010]: I0319 09:19:36.237619 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:36.239530 master-0 kubenswrapper[4010]: I0319 09:19:36.237752 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:36.240942 master-0 kubenswrapper[4010]: I0319 09:19:36.240325 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.242934 master-0 kubenswrapper[4010]: I0319 09:19:36.242864 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " 
pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:36.243084 master-0 kubenswrapper[4010]: I0319 09:19:36.243041 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:36.243527 master-0 kubenswrapper[4010]: I0319 09:19:36.243498 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:36.244091 master-0 kubenswrapper[4010]: I0319 09:19:36.244051 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:36.244161 master-0 kubenswrapper[4010]: I0319 09:19:36.244132 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:36.244898 master-0 kubenswrapper[4010]: I0319 09:19:36.244767 4010 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:36.244967 master-0 kubenswrapper[4010]: I0319 09:19:36.244909 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:36.245400 master-0 kubenswrapper[4010]: I0319 09:19:36.245343 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:36.246193 master-0 kubenswrapper[4010]: I0319 09:19:36.246012 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:36.248883 master-0 kubenswrapper[4010]: I0319 09:19:36.247459 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod 
\"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:36.248883 master-0 kubenswrapper[4010]: I0319 09:19:36.247719 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:36.248883 master-0 kubenswrapper[4010]: I0319 09:19:36.247810 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:36.249852 master-0 kubenswrapper[4010]: I0319 09:19:36.249818 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:36.249905 master-0 kubenswrapper[4010]: I0319 09:19:36.249869 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:36.252863 
master-0 kubenswrapper[4010]: I0319 09:19:36.252830 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:36.262553 master-0 kubenswrapper[4010]: I0319 09:19:36.262504 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" Mar 19 09:19:36.278736 master-0 kubenswrapper[4010]: I0319 09:19:36.278683 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:36.278914 master-0 kubenswrapper[4010]: I0319 09:19:36.278774 4010 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:36.291846 master-0 kubenswrapper[4010]: I0319 09:19:36.291790 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5cd\" (UniqueName: \"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.306441 master-0 kubenswrapper[4010]: I0319 09:19:36.306368 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:36.309662 master-0 kubenswrapper[4010]: I0319 09:19:36.309620 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:36.318187 master-0 kubenswrapper[4010]: I0319 09:19:36.317817 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:36.319993 master-0 kubenswrapper[4010]: I0319 09:19:36.319917 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lktk8\" (UniqueName: \"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:36.327018 master-0 kubenswrapper[4010]: I0319 09:19:36.326630 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:19:36.336182 master-0 kubenswrapper[4010]: I0319 09:19:36.336107 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:36.343961 master-0 kubenswrapper[4010]: I0319 09:19:36.343881 4010 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.348959 master-0 kubenswrapper[4010]: I0319 09:19:36.348920 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:36.361180 master-0 kubenswrapper[4010]: I0319 09:19:36.361113 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:36.382844 master-0 kubenswrapper[4010]: I0319 09:19:36.382374 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:19:36.416495 master-0 kubenswrapper[4010]: I0319 09:19:36.414681 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:36.437567 master-0 kubenswrapper[4010]: I0319 09:19:36.434489 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:36.473792 master-0 kubenswrapper[4010]: I0319 09:19:36.473745 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:36.493749 master-0 kubenswrapper[4010]: I0319 09:19:36.493703 4010 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"
Mar 19 09:19:36.607711 master-0 kubenswrapper[4010]: I0319 09:19:36.605983 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"]
Mar 19 09:19:36.615205 master-0 kubenswrapper[4010]: I0319 09:19:36.614364 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"]
Mar 19 09:19:36.682356 master-0 kubenswrapper[4010]: I0319 09:19:36.682250 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"]
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.721381 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.721706 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.721766 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.721952 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722119 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722128 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.722193 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722215 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722190092 +0000 UTC m=+157.248134759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.722285 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.722316 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722290 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722329 4010 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722361 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722381 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722370537 +0000 UTC m=+157.248315144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.722482 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722533 4010 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: E0319 09:19:36.722592 4010 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:36.724493 master-0 kubenswrapper[4010]: I0319 09:19:36.722548 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722622 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722595432 +0000 UTC m=+157.248540089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722641 4010 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722643 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722634943 +0000 UTC m=+157.248579630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722662 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722653334 +0000 UTC m=+157.248598011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722677 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722669694 +0000 UTC m=+157.248614381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722694 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722683975 +0000 UTC m=+157.248628772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: I0319 09:19:36.722724 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722767 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722757406 +0000 UTC m=+157.248702093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722797 4010 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722823 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.722814988 +0000 UTC m=+157.248759665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: I0319 09:19:36.722820 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722838 4010 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:36.725345 master-0 kubenswrapper[4010]: E0319 09:19:36.722875 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:19:36.725812 master-0 kubenswrapper[4010]: E0319 09:19:36.722902 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.72289479 +0000 UTC m=+157.248839457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found
Mar 19 09:19:36.725812 master-0 kubenswrapper[4010]: E0319 09:19:36.722931 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:37.7229084 +0000 UTC m=+157.248853057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found
Mar 19 09:19:36.766707 master-0 kubenswrapper[4010]: I0319 09:19:36.764419 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"]
Mar 19 09:19:36.782213 master-0 kubenswrapper[4010]: W0319 09:19:36.780384 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d3ca81_26e1_4625_a3aa_b1eabd31cfd6.slice/crio-53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e WatchSource:0}: Error finding container 53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e: Status 404 returned error can't find the container with id 53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e
Mar 19 09:19:36.798883 master-0 kubenswrapper[4010]: I0319 09:19:36.798804 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"]
Mar 19 09:19:36.803552 master-0 kubenswrapper[4010]: I0319 09:19:36.803359 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"]
Mar 19 09:19:36.837166 master-0 kubenswrapper[4010]: I0319 09:19:36.836787 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"]
Mar 19 09:19:36.848754 master-0 kubenswrapper[4010]: I0319 09:19:36.847889 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"]
Mar 19 09:19:36.852433 master-0 kubenswrapper[4010]: W0319 09:19:36.852364 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310d604b_fe9a_4b19_b8b5_7a1983e45e67.slice/crio-c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937 WatchSource:0}: Error finding container c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937: Status 404 returned error can't find the container with id c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937
Mar 19 09:19:36.861747 master-0 kubenswrapper[4010]: I0319 09:19:36.858024 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"]
Mar 19 09:19:36.870951 master-0 kubenswrapper[4010]: I0319 09:19:36.869030 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"]
Mar 19 09:19:36.870951 master-0 kubenswrapper[4010]: I0319 09:19:36.869866 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"]
Mar 19 09:19:36.871562 master-0 kubenswrapper[4010]: W0319 09:19:36.871517 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b36f3b2_caf9_40ad_a3a1_e83796142f54.slice/crio-16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679 WatchSource:0}: Error finding container 16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679: Status 404 returned error can't find the container with id 16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679
Mar 19 09:19:36.874082 master-0 kubenswrapper[4010]: W0319 09:19:36.873771 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf216606b_43d0_43d0_a3e3_a3ee2952e7b8.slice/crio-b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e WatchSource:0}: Error finding container b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e: Status 404 returned error can't find the container with id b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e
Mar 19 09:19:37.026665 master-0 kubenswrapper[4010]: I0319 09:19:37.026623 4010 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"]
Mar 19 09:19:37.044354 master-0 kubenswrapper[4010]: W0319 09:19:37.044303 4010 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode09725c2_45c6_4a60_b817_6e5316d6f8e8.slice/crio-948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f WatchSource:0}: Error finding container 948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f: Status 404 returned error can't find the container with id 948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f
Mar 19 09:19:37.186414 master-0 kubenswrapper[4010]: I0319 09:19:37.186228 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" event={"ID":"357980ba-1957-412f-afb5-04281eca2bee","Type":"ContainerStarted","Data":"8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c"}
Mar 19 09:19:37.189622 master-0 kubenswrapper[4010]: I0319 09:19:37.189536 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" event={"ID":"a823c8bc-09ef-46a9-a1f3-155a34b89788","Type":"ContainerStarted","Data":"7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e"}
Mar 19 09:19:37.190824 master-0 kubenswrapper[4010]: I0319 09:19:37.190776 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" event={"ID":"a75049de-dcf1-4102-b339-f45d5015adea","Type":"ContainerStarted","Data":"e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57"}
Mar 19 09:19:37.192404 master-0 kubenswrapper[4010]: I0319 09:19:37.191915 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" event={"ID":"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6","Type":"ContainerStarted","Data":"53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e"}
Mar 19 09:19:37.193196 master-0 kubenswrapper[4010]: I0319 09:19:37.193138 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" event={"ID":"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8","Type":"ContainerStarted","Data":"821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e"}
Mar 19 09:19:37.194343 master-0 kubenswrapper[4010]: I0319 09:19:37.194310 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" event={"ID":"083882c0-ea2f-4405-8cf1-cce5b91fe602","Type":"ContainerStarted","Data":"0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74"}
Mar 19 09:19:37.195782 master-0 kubenswrapper[4010]: I0319 09:19:37.195733 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" event={"ID":"e09725c2-45c6-4a60-b817-6e5316d6f8e8","Type":"ContainerStarted","Data":"948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f"}
Mar 19 09:19:37.206570 master-0 kubenswrapper[4010]: I0319 09:19:37.203531 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" event={"ID":"310d604b-fe9a-4b19-b8b5-7a1983e45e67","Type":"ContainerStarted","Data":"f349a28ea0bb985b97d809f46b60d5c4412444c67eeb0389e91efb0430bb6dcb"}
Mar 19 09:19:37.206570 master-0 kubenswrapper[4010]: I0319 09:19:37.203576 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" event={"ID":"310d604b-fe9a-4b19-b8b5-7a1983e45e67","Type":"ContainerStarted","Data":"c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937"}
Mar 19 09:19:37.206570 master-0 kubenswrapper[4010]: I0319 09:19:37.205952 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" event={"ID":"9663cc40-a69d-42ba-890e-071cb85062f5","Type":"ContainerStarted","Data":"75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc"}
Mar 19 09:19:37.206928 master-0 kubenswrapper[4010]: I0319 09:19:37.206839 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2s58d" event={"ID":"bec90db1-02e3-4211-8c33-f8bcc304e3a7","Type":"ContainerStarted","Data":"583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd"}
Mar 19 09:19:37.207961 master-0 kubenswrapper[4010]: I0319 09:19:37.207920 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerStarted","Data":"b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e"}
Mar 19 09:19:37.208850 master-0 kubenswrapper[4010]: I0319 09:19:37.208813 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" event={"ID":"5b36f3b2-caf9-40ad-a3a1-e83796142f54","Type":"ContainerStarted","Data":"16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679"}
Mar 19 09:19:37.209989 master-0 kubenswrapper[4010]: I0319 09:19:37.209957 4010 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" event={"ID":"86c4b0e4-3481-465d-b00f-022d2c58c183","Type":"ContainerStarted","Data":"a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2"}
Mar 19 09:19:37.228512 master-0 kubenswrapper[4010]: I0319 09:19:37.228433 4010 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" podStartSLOduration=115.228411868 podStartE2EDuration="1m55.228411868s" podCreationTimestamp="2026-03-19 09:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:19:37.227742161 +0000 UTC m=+156.753686768" watchObservedRunningTime="2026-03-19 09:19:37.228411868 +0000 UTC m=+156.754356495"
Mar 19 09:19:37.742101 master-0 kubenswrapper[4010]: I0319 09:19:37.742041 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:37.742101 master-0 kubenswrapper[4010]: I0319 09:19:37.742108 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:37.742452 master-0 kubenswrapper[4010]: I0319 09:19:37.742137 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:37.742452 master-0 kubenswrapper[4010]: E0319 09:19:37.742297 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:19:37.742536 master-0 kubenswrapper[4010]: I0319 09:19:37.742408 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:37.742536 master-0 kubenswrapper[4010]: I0319 09:19:37.742514 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:37.742605 master-0 kubenswrapper[4010]: E0319 09:19:37.742549 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:37.742605 master-0 kubenswrapper[4010]: I0319 09:19:37.742578 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:37.742662 master-0 kubenswrapper[4010]: E0319 09:19:37.742617 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.742595728 +0000 UTC m=+159.268540335 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:37.742700 master-0 kubenswrapper[4010]: I0319 09:19:37.742661 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:37.742700 master-0 kubenswrapper[4010]: I0319 09:19:37.742689 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:37.742767 master-0 kubenswrapper[4010]: E0319 09:19:37.742718 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.742685751 +0000 UTC m=+159.268630358 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found
Mar 19 09:19:37.742808 master-0 kubenswrapper[4010]: E0319 09:19:37.742774 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:19:37.742808 master-0 kubenswrapper[4010]: E0319 09:19:37.742798 4010 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:37.742876 master-0 kubenswrapper[4010]: E0319 09:19:37.742834 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.742816684 +0000 UTC m=+159.268761291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found
Mar 19 09:19:37.742876 master-0 kubenswrapper[4010]: E0319 09:19:37.742841 4010 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:19:37.742979 master-0 kubenswrapper[4010]: E0319 09:19:37.742908 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.742843565 +0000 UTC m=+159.268788172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found
Mar 19 09:19:37.742979 master-0 kubenswrapper[4010]: E0319 09:19:37.742927 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.742919367 +0000 UTC m=+159.268864064 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found
Mar 19 09:19:37.743165 master-0 kubenswrapper[4010]: E0319 09:19:37.743075 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:19:37.743165 master-0 kubenswrapper[4010]: E0319 09:19:37.743103 4010 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:19:37.743289 master-0 kubenswrapper[4010]: E0319 09:19:37.743112 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.743103531 +0000 UTC m=+159.269048138 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found
Mar 19 09:19:37.743289 master-0 kubenswrapper[4010]: E0319 09:19:37.743111 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:19:37.743289 master-0 kubenswrapper[4010]: E0319 09:19:37.743282 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.743272866 +0000 UTC m=+159.269217473 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found
Mar 19 09:19:37.743412 master-0 kubenswrapper[4010]: E0319 09:19:37.743336 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.743312817 +0000 UTC m=+159.269257434 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:37.745552 master-0 kubenswrapper[4010]: I0319 09:19:37.745455 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:37.746228 master-0 kubenswrapper[4010]: I0319 09:19:37.745613 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:37.746228 master-0 kubenswrapper[4010]: E0319 09:19:37.745646 4010 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:37.746228 master-0 kubenswrapper[4010]: E0319 09:19:37.745687 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.745675508 +0000 UTC m=+159.271620195 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:37.746228 master-0 kubenswrapper[4010]: I0319 09:19:37.745649 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:37.746963 master-0 kubenswrapper[4010]: E0319 09:19:37.745741 4010 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:37.747043 master-0 kubenswrapper[4010]: E0319 09:19:37.747021 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.747000822 +0000 UTC m=+159.272945489 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:37.747104 master-0 kubenswrapper[4010]: E0319 09:19:37.745780 4010 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:37.747104 master-0 kubenswrapper[4010]: E0319 09:19:37.747090 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:39.747079584 +0000 UTC m=+159.273024301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:39.770249 master-0 kubenswrapper[4010]: I0319 09:19:39.770176 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: I0319 09:19:39.770411 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod 
\"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: I0319 09:19:39.770459 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770454 4010 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770646 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.770606812 +0000 UTC m=+163.296551599 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: I0319 09:19:39.770496 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: I0319 09:19:39.770796 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: I0319 09:19:39.770863 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770810 4010 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770951 4010 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.7709342 +0000 UTC m=+163.296878807 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: I0319 09:19:39.770893 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770980 4010 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.771016 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770851 4010 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.771036 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:43.771018912 +0000 UTC m=+163.296963509 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.770889 4010 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:39.771046 master-0 kubenswrapper[4010]: E0319 09:19:39.771053 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771045313 +0000 UTC m=+163.296989920 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771068 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771059153 +0000 UTC m=+163.297003750 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771081 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771074883 +0000 UTC m=+163.297019490 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: I0319 09:19:39.770984 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771096 4010 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: I0319 09:19:39.771109 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: 
\"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771118 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771112304 +0000 UTC m=+163.297056911 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: I0319 09:19:39.771135 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771149 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771168 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771162606 +0000 UTC m=+163.297107213 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: I0319 09:19:39.771182 4010 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771201 4010 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771220 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771215187 +0000 UTC m=+163.297159794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771237 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:39.771760 master-0 kubenswrapper[4010]: E0319 09:19:39.771257 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771251878 +0000 UTC m=+163.297196485 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:19:39.772186 master-0 kubenswrapper[4010]: E0319 09:19:39.771282 4010 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:39.772186 master-0 kubenswrapper[4010]: E0319 09:19:39.771317 4010 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.771298849 +0000 UTC m=+163.297243456 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:42.026799 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 19 09:19:42.053701 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 19 09:19:42.053994 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 19 09:19:42.054957 master-0 systemd[1]: kubelet.service: Consumed 10.189s CPU time. Mar 19 09:19:42.070063 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 19 09:19:42.164741 master-0 kubenswrapper[7518]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:19:42.164741 master-0 kubenswrapper[7518]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 09:19:42.164741 master-0 kubenswrapper[7518]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:19:42.164741 master-0 kubenswrapper[7518]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:19:42.164741 master-0 kubenswrapper[7518]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. Mar 19 09:19:42.164741 master-0 kubenswrapper[7518]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:19:42.165893 master-0 kubenswrapper[7518]: I0319 09:19:42.164827 7518 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 09:19:42.167343 master-0 kubenswrapper[7518]: W0319 09:19:42.167316 7518 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:19:42.167343 master-0 kubenswrapper[7518]: W0319 09:19:42.167330 7518 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:19:42.167343 master-0 kubenswrapper[7518]: W0319 09:19:42.167335 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:19:42.167343 master-0 kubenswrapper[7518]: W0319 09:19:42.167339 7518 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:19:42.167343 master-0 kubenswrapper[7518]: W0319 09:19:42.167343 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:19:42.167343 master-0 kubenswrapper[7518]: W0319 09:19:42.167349 7518 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
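Note the retry cadence in the `nestedpendingoperations.go:348` entries above: `durationBeforeRetry 2s` on the first failure, `4s` on the next. This is the kubelet's exponential backoff for failed volume operations. A minimal sketch of that policy (the 2s initial delay and 2x factor match the log; the cap here is an illustrative assumption, and this is not the kubelet's actual `nestedpendingoperations` code):

```python
# Sketch of the backoff behind the "durationBeforeRetry" values above:
# start at 2s, double after each consecutive failure, up to a cap.
# The cap value is an assumption for illustration only.
def backoff_delays(initial=2.0, factor=2.0, cap=300.0):
    """Yield successive retry delays in seconds for one failing operation."""
    delay = initial
    while True:
        yield min(delay, cap)
        delay *= factor

gen = backoff_delays()
first_four = [next(gen) for _ in range(4)]
# matches the 2s -> 4s progression seen in the log, then keeps doubling
```

The backoff state is keyed per volume/pod pair, which is why each of the pods above independently shows 2s and then 4s.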
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167354 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167358 7518 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167362 7518 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167366 7518 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167369 7518 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167373 7518 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167376 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167385 7518 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167389 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167393 7518 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167398 7518 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167402 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167406 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167409 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167414 7518 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167419 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167422 7518 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167427 7518 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:19:42.167552 master-0 kubenswrapper[7518]: W0319 09:19:42.167430 7518 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167434 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167439 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167442 7518 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167446 7518 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167450 7518 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167453 7518 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167458 7518 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167462 7518 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167481 7518 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167486 7518 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167490 7518 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167493 7518 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167497 7518 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167501 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167505 7518 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167508 7518 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167512 7518 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167516 7518 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167519 7518 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:19:42.168175 master-0 kubenswrapper[7518]: W0319 09:19:42.167523 7518 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167526 7518 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167529 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167533 7518 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167537 7518 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167541 7518 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167545 7518 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167548 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167552 7518 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167555 7518 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167558 7518 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167562 7518 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167565 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167569 7518 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167572 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167576 7518 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167580 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167583 7518 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167587 7518 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167590 7518 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:19:42.168797 master-0 kubenswrapper[7518]: W0319 09:19:42.167593 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167605 7518 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167608 7518 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167612 7518 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167615 7518 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167619 7518 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167622 7518 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: W0319 09:19:42.167625 7518 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167709 7518 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167717 7518 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167723 7518 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167728 7518 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167733 7518 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167738 7518 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167743 7518 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167749 7518 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167753 7518 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167757 7518 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167761 7518 flags.go:64] FLAG:
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167766 7518 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167770 7518 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167775 7518 flags.go:64] FLAG: --cgroup-root="" Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167779 7518 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 09:19:42.169350 master-0 kubenswrapper[7518]: I0319 09:19:42.167783 7518 flags.go:64] FLAG: --client-ca-file="" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167787 7518 flags.go:64] FLAG: --cloud-config="" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167791 7518 flags.go:64] FLAG: --cloud-provider="" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167795 7518 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167800 7518 flags.go:64] FLAG: --cluster-domain="" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167804 7518 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167808 7518 flags.go:64] FLAG: --config-dir="" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167812 7518 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167817 7518 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167822 7518 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167829 7518 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 09:19:42.170047 master-0 
kubenswrapper[7518]: I0319 09:19:42.167833 7518 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167838 7518 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167842 7518 flags.go:64] FLAG: --contention-profiling="false" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167846 7518 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167850 7518 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167855 7518 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167859 7518 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167864 7518 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167868 7518 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167872 7518 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167876 7518 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167881 7518 flags.go:64] FLAG: --enable-server="true" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167885 7518 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167891 7518 flags.go:64] FLAG: --event-burst="100" Mar 19 09:19:42.170047 master-0 kubenswrapper[7518]: I0319 09:19:42.167896 7518 flags.go:64] FLAG: --event-qps="50" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167900 7518 
flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167904 7518 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167909 7518 flags.go:64] FLAG: --eviction-hard="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167914 7518 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167935 7518 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167940 7518 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167945 7518 flags.go:64] FLAG: --eviction-soft="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167950 7518 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167954 7518 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167959 7518 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167963 7518 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167967 7518 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167971 7518 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167975 7518 flags.go:64] FLAG: --feature-gates="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167980 7518 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167985 7518 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167991 7518 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.167996 7518 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168000 7518 flags.go:64] FLAG: --healthz-port="10248" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168004 7518 flags.go:64] FLAG: --help="false" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168008 7518 flags.go:64] FLAG: --hostname-override="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168012 7518 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168016 7518 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168020 7518 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 09:19:42.170754 master-0 kubenswrapper[7518]: I0319 09:19:42.168024 7518 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168028 7518 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168032 7518 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168036 7518 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168040 7518 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168044 7518 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168048 7518 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168053 7518 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168057 7518 flags.go:64] FLAG: --kube-reserved="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168061 7518 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168065 7518 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168069 7518 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168073 7518 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168077 7518 flags.go:64] FLAG: --lock-file="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168081 7518 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168085 7518 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168090 7518 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168096 7518 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168100 7518 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168105 7518 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168110 7518 flags.go:64] FLAG: --logging-format="text" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168114 7518 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168119 7518 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168123 7518 flags.go:64] FLAG: --manifest-url="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168129 7518 flags.go:64] FLAG: --manifest-url-header="" Mar 19 09:19:42.171480 master-0 kubenswrapper[7518]: I0319 09:19:42.168134 7518 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168139 7518 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168144 7518 flags.go:64] FLAG: --max-pods="110" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168149 7518 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168153 7518 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168157 7518 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168161 7518 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168166 7518 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168170 7518 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168174 7518 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168183 7518 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 09:19:42.172289 master-0 
kubenswrapper[7518]: I0319 09:19:42.168188 7518 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168192 7518 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168196 7518 flags.go:64] FLAG: --pod-cidr="" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168200 7518 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168206 7518 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168210 7518 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168215 7518 flags.go:64] FLAG: --pods-per-core="0" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168219 7518 flags.go:64] FLAG: --port="10250" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168223 7518 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168228 7518 flags.go:64] FLAG: --provider-id="" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168232 7518 flags.go:64] FLAG: --qos-reserved="" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168236 7518 flags.go:64] FLAG: --read-only-port="10255" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168240 7518 flags.go:64] FLAG: --register-node="true" Mar 19 09:19:42.172289 master-0 kubenswrapper[7518]: I0319 09:19:42.168244 7518 flags.go:64] FLAG: --register-schedulable="true" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168248 7518 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 09:19:42.172886 
master-0 kubenswrapper[7518]: I0319 09:19:42.168255 7518 flags.go:64] FLAG: --registry-burst="10" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168259 7518 flags.go:64] FLAG: --registry-qps="5" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168263 7518 flags.go:64] FLAG: --reserved-cpus="" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168267 7518 flags.go:64] FLAG: --reserved-memory="" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168272 7518 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168277 7518 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168288 7518 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168292 7518 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168297 7518 flags.go:64] FLAG: --runonce="false" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168301 7518 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168305 7518 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168310 7518 flags.go:64] FLAG: --seccomp-default="false" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168314 7518 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168318 7518 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168322 7518 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168326 7518 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168331 7518 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168335 7518 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168339 7518 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168343 7518 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168347 7518 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168351 7518 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168356 7518 flags.go:64] FLAG: --system-cgroups="" Mar 19 09:19:42.172886 master-0 kubenswrapper[7518]: I0319 09:19:42.168359 7518 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168366 7518 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168370 7518 flags.go:64] FLAG: --tls-cert-file="" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168374 7518 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168379 7518 flags.go:64] FLAG: --tls-min-version="" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168383 7518 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168387 7518 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168391 7518 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 
09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168396 7518 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168401 7518 flags.go:64] FLAG: --v="2" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168409 7518 flags.go:64] FLAG: --version="false" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168415 7518 flags.go:64] FLAG: --vmodule="" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168420 7518 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: I0319 09:19:42.168425 7518 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168555 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168560 7518 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168564 7518 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168568 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168572 7518 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168576 7518 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168580 7518 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168583 7518 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:19:42.174011 master-0 
kubenswrapper[7518]: W0319 09:19:42.168587 7518 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:19:42.174011 master-0 kubenswrapper[7518]: W0319 09:19:42.168591 7518 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168594 7518 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168598 7518 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168601 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168605 7518 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168608 7518 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168612 7518 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168615 7518 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168619 7518 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168622 7518 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168625 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168629 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168633 7518 
feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168637 7518 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168640 7518 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168644 7518 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168648 7518 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168651 7518 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168655 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:19:42.174654 master-0 kubenswrapper[7518]: W0319 09:19:42.168658 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168663 7518 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168668 7518 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168672 7518 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168675 7518 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168679 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168683 7518 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168687 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168691 7518 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168695 7518 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168698 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168702 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168706 7518 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168710 7518 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168713 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 
09:19:42.168717 7518 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168720 7518 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168724 7518 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168729 7518 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:19:42.175261 master-0 kubenswrapper[7518]: W0319 09:19:42.168733 7518 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168737 7518 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168741 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168745 7518 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168749 7518 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168753 7518 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168757 7518 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168761 7518 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168764 7518 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168768 7518 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168772 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168775 7518 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168779 7518 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168784 7518 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168788 7518 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168793 7518 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168797 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168801 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168804 7518 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168808 7518 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:19:42.175888 master-0 kubenswrapper[7518]: W0319 09:19:42.168812 7518 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: W0319 09:19:42.168816 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: W0319 09:19:42.168820 7518 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: W0319 09:19:42.168824 7518 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: W0319 09:19:42.168828 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: I0319 09:19:42.168840 7518 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: I0319 09:19:42.176669 7518 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 09:19:42.176735 master-0 kubenswrapper[7518]: I0319 09:19:42.176703 7518 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176777 7518 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176786 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176791 7518 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176795 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176801 7518 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176805 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176809 7518 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176813 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176817 7518 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176821 7518 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176825 7518 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176829 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176833 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176839 7518 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176843 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176849 7518 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176855 7518 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176861 7518 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176865 7518 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:19:42.176940 master-0 kubenswrapper[7518]: W0319 09:19:42.176871 7518 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176876 7518 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176881 7518 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176886 7518 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176892 7518 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176898 7518 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176903 7518 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176907 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176912 7518 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176916 7518 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176921 7518 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176925 7518 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176930 7518 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176934 7518 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176940 7518 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176945 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176950 7518 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176954 7518 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176959 7518 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176963 7518 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:19:42.177504 master-0 kubenswrapper[7518]: W0319 09:19:42.176967 7518 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176972 7518 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176976 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176980 7518 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176984 7518 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176988 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176993 7518 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.176997 7518 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177002 7518 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177006 7518 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177011 7518 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177016 7518 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177019 7518 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177023 7518 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177027 7518 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177031 7518 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177035 7518 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177040 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177043 7518 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:19:42.178182 master-0 kubenswrapper[7518]: W0319 09:19:42.177047 7518 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177051 7518 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177055 7518 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177059 7518 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177062 7518 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177068 7518 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177072 7518 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177077 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177081 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177086 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177090 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177094 7518 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177098 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177102 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: I0319 09:19:42.177109 7518 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:19:42.178804 master-0 kubenswrapper[7518]: W0319 09:19:42.177282 7518 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177293 7518 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177298 7518 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177303 7518 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177307 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177312 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177316 7518 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177321 7518 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177327 7518 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177332 7518 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177337 7518 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177341 7518 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177345 7518 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177350 7518 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177355 7518 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177359 7518 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177363 7518 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177367 7518 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177372 7518 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177376 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:19:42.179326 master-0 kubenswrapper[7518]: W0319 09:19:42.177380 7518 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177384 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177387 7518 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177392 7518 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177396 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177400 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177406 7518 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177410 7518 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177414 7518 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177418 7518 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177422 7518 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177425 7518 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177429 7518 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177433 7518 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177437 7518 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177441 7518 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177445 7518 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177449 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177454 7518 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177459 7518 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:19:42.179913 master-0 kubenswrapper[7518]: W0319 09:19:42.177479 7518 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177484 7518 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177489 7518 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177494 7518 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177498 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177503 7518 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177509 7518 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177514 7518 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177518 7518 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177524 7518 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177530 7518 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177536 7518 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177540 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177545 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177549 7518 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177554 7518 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177559 7518 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177563 7518 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177568 7518 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177572 7518 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:19:42.180490 master-0 kubenswrapper[7518]: W0319 09:19:42.177576 7518 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177580 7518 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177585 7518 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177589 7518 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177592 7518 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177597 7518 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177601 7518 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177606 7518 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177611 7518 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177616 7518 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177620 7518 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: W0319 09:19:42.177625 7518 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: I0319 09:19:42.177631 7518 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: I0319 09:19:42.177821 7518 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 09:19:42.181933 master-0 kubenswrapper[7518]: I0319 09:19:42.179546 7518 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.179638 7518 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.179890 7518 server.go:997] "Starting client certificate rotation"
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.179919 7518 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.180102 7518 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 02:30:32.337112716 +0000 UTC
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.180154 7518 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h10m50.156961311s for next certificate rotation
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.180639 7518 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:19:42.182550 master-0 kubenswrapper[7518]: I0319 09:19:42.182102 7518 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:19:42.186525 master-0 kubenswrapper[7518]: I0319 09:19:42.186497 7518 log.go:25] "Validated CRI v1 runtime API"
Mar 19 09:19:42.189376 master-0 kubenswrapper[7518]: I0319 09:19:42.189336 7518 log.go:25] "Validated CRI v1 image API"
Mar 19 09:19:42.191009 master-0 kubenswrapper[7518]: I0319 09:19:42.190971 7518 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 09:19:42.196398 master-0 kubenswrapper[7518]: I0319 09:19:42.196291 7518 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 a870f5cc-57ed-47cd-b7c0-f85f1fc0e63d:/dev/vda3]
Mar 19 09:19:42.197641 master-0 kubenswrapper[7518]: I0319 09:19:42.196382 7518 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56/userdata/shm major:0 minor:141 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204/userdata/shm major:0 minor:103 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42/userdata/shm major:0 minor:147 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c/userdata/shm major:0 minor:247 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687/userdata/shm major:0 minor:86 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43/userdata/shm major:0 minor:126 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~projected/kube-api-access-mlwd5:{mountpoint:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~projected/kube-api-access-mlwd5 major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~projected/kube-api-access-47czp:{mountpoint:/var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~projected/kube-api-access-47czp major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~projected/kube-api-access-lktk8:{mountpoint:/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~projected/kube-api-access-lktk8 major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~projected/kube-api-access major:0
minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/32b1ae47-ef83-448d-b40d-a836cb6c6fc0/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/32b1ae47-ef83-448d-b40d-a836cb6c6fc0/volumes/kubernetes.io~projected/kube-api-access major:0 minor:107 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~projected/kube-api-access-npxz5:{mountpoint:/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~projected/kube-api-access-npxz5 major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~projected/kube-api-access-8zvxj:{mountpoint:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~projected/kube-api-access-8zvxj major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~projected/kube-api-access-ptcvr:{mountpoint:/var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~projected/kube-api-access-ptcvr major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~projected/kube-api-access-7k8wj:{mountpoint:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~projected/kube-api-access-7k8wj major:0 minor:224 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~projected/kube-api-access-4n2hg:{mountpoint:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~projected/kube-api-access-4n2hg major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~projected/kube-api-access major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:259 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/kube-api-access-548cd:{mountpoint:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/kube-api-access-548cd major:0 minor:241 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~projected/kube-api-access-qv8vk:{mountpoint:/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~projected/kube-api-access-qv8vk major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~projected/kube-api-access-qh4t8:{mountpoint:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~projected/kube-api-access-qh4t8 major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~projected/kube-api-access-qn48v:{mountpoint:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~projected/kube-api-access-qn48v major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/872e5f8c-b014-4283-a4d2-0e2cfd29e192/volumes/kubernetes.io~projected/kube-api-access-kfpv6:{mountpoint:/var/lib/kubelet/pods/872e5f8c-b014-4283-a4d2-0e2cfd29e192/volumes/kubernetes.io~projected/kube-api-access-kfpv6 major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~projected/kube-api-access-ft9rs:{mountpoint:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~projected/kube-api-access-ft9rs major:0 minor:99 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~projected/kube-api-access-m4rtm:{mountpoint:/var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~projected/kube-api-access-m4rtm major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~projected/kube-api-access-n6zkv:{mountpoint:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~projected/kube-api-access-n6zkv major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/etcd-client major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~projected/kube-api-access-fp46p:{mountpoint:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~projected/kube-api-access-fp46p major:0 minor:135 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:134 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~projected/kube-api-access-bgmwd:{mountpoint:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~projected/kube-api-access-bgmwd major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/kube-api-access-wdmtg:{mountpoint:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/kube-api-access-wdmtg major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~projected/kube-api-access-4mvqh:{mountpoint:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~projected/kube-api-access-4mvqh major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~projected/kube-api-access major:0 minor:225 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~projected/kube-api-access-x9zg8:{mountpoint:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~projected/kube-api-access-x9zg8 major:0 minor:148 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~secret/webhook-cert major:0 minor:146 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bec90db1-02e3-4211-8c33-f8bcc304e3a7/volumes/kubernetes.io~projected/kube-api-access-nr5cd:{mountpoint:/var/lib/kubelet/pods/bec90db1-02e3-4211-8c33-f8bcc304e3a7/volumes/kubernetes.io~projected/kube-api-access-nr5cd major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d/volumes/kubernetes.io~projected/kube-api-access-kxv42:{mountpoint:/var/lib/kubelet/pods/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d/volumes/kubernetes.io~projected/kube-api-access-kxv42 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e09725c2-45c6-4a60-b817-6e5316d6f8e8/volumes/kubernetes.io~projected/kube-api-access-b49lj:{mountpoint:/var/lib/kubelet/pods/e09725c2-45c6-4a60-b817-6e5316d6f8e8/volumes/kubernetes.io~projected/kube-api-access-b49lj major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e9ebcecb-c210-434e-83a1-825265e206f1/volumes/kubernetes.io~projected/kube-api-access-txxpw:{mountpoint:/var/lib/kubelet/pods/e9ebcecb-c210-434e-83a1-825265e206f1/volumes/kubernetes.io~projected/kube-api-access-txxpw major:0 minor:110 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~projected/kube-api-access-pvq8m:{mountpoint:/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~projected/kube-api-access-pvq8m major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~projected/kube-api-access-bd8nz:{mountpoint:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~projected/kube-api-access-bd8nz major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/3b97a8f1b9aa695d2b507e36574de13fb7b2d893d3f7915c2ff188f5dfa3ce89/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/adeb790a2e31fe1b9b264b3c4c44d9de1f0407f9dae672fec0bc6364f6ae83fd/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-115:{mountpoint:/var/lib/containers/storage/overlay/eb3b4089c342ee40a11532342bb40930057348f834307b9bc8e75b6c1971f7a3/merged major:0 minor:115 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/7c1d03227cc514591244dc86b0488b2dd57a2f0120c552b2e59e9b0db70adbda/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-128:{mountpoint:/var/lib/containers/storage/overlay/27837f8a552897312a5df71dc51f67a7f2566ef138ed2ef3eb8acff9f26aa0c2/merged major:0 minor:128 fsType:overlay blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/cc31dfd2379c9971b69afc9947cf4c7ac982d3db70c2b85aa2250c5bc552f1e4/merged major:0 minor:130 fsType:overlay blockSize:0} 
overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/664defcb16835a98985b80e688f010e510b521e3c7bbf8694bec00a9717f5846/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-143:{mountpoint:/var/lib/containers/storage/overlay/16e4be12f350c7085bb021957915aa09d93d058a8fbd6f621bf8850678f5b02d/merged major:0 minor:143 fsType:overlay blockSize:0} overlay_0-145:{mountpoint:/var/lib/containers/storage/overlay/f0ce9109e3f2ea0960d438756d371d4f67f25534e78d031377afc9d510035bc6/merged major:0 minor:145 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/3a803f486f6d8c4b685d88ab54fcdb5f980a4adc9a2570fdcc385ad92adbd79c/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/ec02cc5b0439bdac0cfd0368863e0a2b0c115ebac054de299ec787fb68b05be1/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/92ed9422a8a52a0b815c358138f5c4ce5582906c2a8e1b9c290049a52c6b7a0b/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/12769642dae853082d9d0189953db8bea65bcfec768237b64fe7dda87fbc2a20/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-165:{mountpoint:/var/lib/containers/storage/overlay/0e6e6544a0f5b188fde00cb534b87d94aa8cffc724a593eacd40cccdd26bdc64/merged major:0 minor:165 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/75fb4672b66560f1f37663437160beb7c1238bb2a2a11bdf8de39e030c94f89e/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/88d58b0aef7f9e8851dfa4e94ff3c2f1a038ffd451a5d817070acf94fe3c0436/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/1a268ffb078841f019c6c17a4b112c9528c058506abaa493dd32ddee8a90c13d/merged major:0 minor:184 fsType:overlay blockSize:0} 
overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/9604cc5fda4c68a8ea9e20e3cfbf191b34fee6153482c301760a9b4f1840a1de/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/badc0396407a89da87562b0406ef41865cfbc02c14b3633ca5291e241772ec31/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/639af4189b80d4c11315fdecd4017cc1666a4da28b619bde56a08992623a7f2d/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/db818d7ee7fbc6743e6cf78d723adedb29810fd455fac9767ee4c5b5faf10fae/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-264:{mountpoint:/var/lib/containers/storage/overlay/8dda2f860fbc36b9f750d0c72fbdbf3e38b965f3f0a7e8d692add25e1b8f7ee8/merged major:0 minor:264 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/f5fa9ce224d3935abdab9fcc2d7b187841600879a7b8506f76f36d505b15bb06/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/ff0ae64d4e511bdd3c156810246aa5c2a6bed651e0fe62a07f14ce1814b4ef48/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/b6713f448c2700caf59c43d9e781c3675ba3cb1c595933f30146cfd29a152c8b/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/5fde0e7afae152fde48b25fba2aa481ad10fa1cff01edde0fd4aa56aa5fc5ecc/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/1a5d06ca1cfee170c56823e73d61225cede0610e23730ad975030ebb61a4f92c/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/ed0c97368b53bd6e86d24abb85153543a3379fe0eee724199d2ad22ea19aa1df/merged major:0 minor:287 fsType:overlay blockSize:0} 
overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/1ae6a436a8f1d4191e9028cc4b927bac6a164407f90cbfdf5502380940e02ef7/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/ced8babf02c9aad5924da9fcea6dc8cf9e8d31ac96d234acc2a3b23b155c3343/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/39ff757ba9cb89cbaad35b01108900675af235291146ab968ab26a1e3909d92f/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/2d3aa458443bac5d28c98dcc664d0b59520eb448dd976ff47d4132081c46d4c9/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/1e941401408c85a846df02d8c24cef1fbd649f6297281f42aa2244993161f1e9/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/586d844bb76f98c7a57c9d9aa798f3ff6558459d47ef4d0272f71008734dff3b/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/fd77557ff2dd557b9dbb3cbd729e994fa36c6975a8dfe8ab15d66fee2bf4912f/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/10136155de1f48ef9cda25f8666729574feebba423dd63d6bad6ae979448b7d5/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/447c600a7e5282250d958d6c3f7137108cfcf9d2fd7207d2f17d64f5b43dda72/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/078dd01a8ad8f658667d5881c4786ee8455a5e43bbb4e3f9545c36832d75a668/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/72d7a3701ba3ba27c47276741c6a79bb2abeae81ae36c32502b637a3921b5010/merged major:0 minor:56 fsType:overlay blockSize:0} 
overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/28de7bec1a409f75e2b93c9f2c19b29ac95ea4778c28e02079796967ed0b702d/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/dc22d5a1a03a131427d82457194270306ece212f2bec56a2e76363273b7d0db5/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/3fdf8dc232536b7803c4eeff9a3e422db6e10e0fa732d30f44ba3b95b1a961e6/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/21f5e3839d6d1c4e1bf487198d7e6adda5970671c7d7304b16ae43d32a240652/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/8589547db06a264a2aaf3bb9cc6c6bc4a80e196e507ce875b211bd539d438837/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/124f026298bffd3a97c4b63309edd7819188935362e954fc10e5aa24defd376b/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/fae2cd0de4e8c186884dab314cb2a8f3c871e3420f8b90c0de0c23451cf2457c/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-94:{mountpoint:/var/lib/containers/storage/overlay/736c855c11143b2febe5c2368176f65100014d8eae83418779a48d72c33bceae/merged major:0 minor:94 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/ec0900ab3eb05ea93c039834024496f79fb3a92342403d7ec1f84d0b857dcf42/merged major:0 minor:96 fsType:overlay blockSize:0}] Mar 19 09:19:42.219809 master-0 kubenswrapper[7518]: I0319 09:19:42.218961 7518 manager.go:217] Machine: {Timestamp:2026-03-19 09:19:42.217760053 +0000 UTC m=+0.100343352 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:81766350eaa9426e82b63b9a7bdd6612 SystemUUID:81766350-eaa9-426e-82b6-3b9a7bdd6612 BootID:183da118-c1b7-4287-af5d-a72bb0b1fda1 Filesystems:[{Device:/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~projected/kube-api-access-npxz5 DeviceMajor:0 DeviceMinor:250 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/32b1ae47-ef83-448d-b40d-a836cb6c6fc0/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:107 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e9ebcecb-c210-434e-83a1-825265e206f1/volumes/kubernetes.io~projected/kube-api-access-txxpw DeviceMajor:0 DeviceMinor:110 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~projected/kube-api-access-bgmwd DeviceMajor:0 DeviceMinor:233 
Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~projected/kube-api-access-bd8nz DeviceMajor:0 DeviceMinor:240 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-115 DeviceMajor:0 DeviceMinor:115 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687/userdata/shm DeviceMajor:0 DeviceMinor:86 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:243 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-143 DeviceMajor:0 DeviceMinor:143 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~projected/kube-api-access-m4rtm 
DeviceMajor:0 DeviceMinor:227 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~projected/kube-api-access-mlwd5 DeviceMajor:0 DeviceMinor:234 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-128 DeviceMajor:0 DeviceMinor:128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-145 DeviceMajor:0 DeviceMinor:145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~projected/kube-api-access-n6zkv DeviceMajor:0 DeviceMinor:238 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-165 DeviceMajor:0 DeviceMinor:165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204/userdata/shm DeviceMajor:0 DeviceMinor:103 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/872e5f8c-b014-4283-a4d2-0e2cfd29e192/volumes/kubernetes.io~projected/kube-api-access-kfpv6 DeviceMajor:0 DeviceMinor:43 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~projected/kube-api-access-pvq8m DeviceMajor:0 DeviceMinor:235 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~projected/kube-api-access-ptcvr DeviceMajor:0 DeviceMinor:123 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~projected/kube-api-access-qh4t8 DeviceMajor:0 DeviceMinor:237 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~projected/kube-api-access-qn48v DeviceMajor:0 DeviceMinor:232 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bec90db1-02e3-4211-8c33-f8bcc304e3a7/volumes/kubernetes.io~projected/kube-api-access-nr5cd DeviceMajor:0 DeviceMinor:245 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~projected/kube-api-access-4mvqh DeviceMajor:0 DeviceMinor:242 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 
DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~projected/kube-api-access-4n2hg DeviceMajor:0 DeviceMinor:125 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/kube-api-access-548cd DeviceMajor:0 DeviceMinor:241 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-264 DeviceMajor:0 DeviceMinor:264 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:223 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:226 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/kube-api-access-wdmtg DeviceMajor:0 DeviceMinor:229 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42/userdata/shm DeviceMajor:0 DeviceMinor:147 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d/volumes/kubernetes.io~projected/kube-api-access-kxv42 
DeviceMajor:0 DeviceMinor:230 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:225 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~projected/kube-api-access-ft9rs DeviceMajor:0 DeviceMinor:99 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~projected/kube-api-access-x9zg8 DeviceMajor:0 DeviceMinor:148 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~projected/kube-api-access-qv8vk DeviceMajor:0 DeviceMinor:239 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43/userdata/shm DeviceMajor:0 DeviceMinor:126 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e09725c2-45c6-4a60-b817-6e5316d6f8e8/volumes/kubernetes.io~projected/kube-api-access-b49lj DeviceMajor:0 DeviceMinor:244 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~projected/kube-api-access-lktk8 DeviceMajor:0 DeviceMinor:253 Capacity:32475533312 
Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:134 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-94 DeviceMajor:0 DeviceMinor:94 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:146 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~projected/kube-api-access-8zvxj DeviceMajor:0 DeviceMinor:228 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~projected/kube-api-access-fp46p DeviceMajor:0 DeviceMinor:135 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56/userdata/shm DeviceMajor:0 DeviceMinor:141 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c/userdata/shm DeviceMajor:0 DeviceMinor:247 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~projected/kube-api-access-7k8wj DeviceMajor:0 DeviceMinor:224 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~projected/kube-api-access-47czp DeviceMajor:0 DeviceMinor:231 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:259 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0330a59f41759e2 MacAddress:ce:a8:8a:51:b4:fe Speed:10000 Mtu:8900} {Name:16baf9775f985e1 MacAddress:02:72:ae:5f:f9:8b Speed:10000 Mtu:8900} {Name:53ca7c2bbb87601 MacAddress:1e:fc:65:15:b0:d8 Speed:10000 Mtu:8900} {Name:75f211854713a8c MacAddress:ee:0a:21:76:6c:9d Speed:10000 Mtu:8900} {Name:7778d952a406316 MacAddress:ce:44:e1:04:08:30 Speed:10000 Mtu:8900} {Name:821c52c85783248 MacAddress:1a:23:ae:ff:65:fc Speed:10000 Mtu:8900} {Name:8ff5a0a197bf95e MacAddress:e2:af:17:bc:be:bc Speed:10000 Mtu:8900} {Name:948a9c37f749c61 MacAddress:b2:b9:56:45:32:6c Speed:10000 Mtu:8900} {Name:a2636e526bcfbc7 MacAddress:ea:76:86:b2:1c:e8 Speed:10000 Mtu:8900} {Name:b892eee40a06455 MacAddress:f2:b7:c1:f2:45:60 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:c6:4a:11:0c:41:f4 Speed:0 Mtu:8900} {Name:c1983dec9f8f8a4 MacAddress:66:2c:8c:3d:05:43 Speed:10000 Mtu:8900} {Name:e927634c086b213 MacAddress:7e:95:33:52:a5:ed Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:d5:00:d5 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:d6:48:60:15:e4:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} 
{Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 
Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 09:19:42.219809 master-0 kubenswrapper[7518]: I0319 09:19:42.219785 7518 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 19 09:19:42.220303 master-0 kubenswrapper[7518]: I0319 09:19:42.220081 7518 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 09:19:42.220632 master-0 kubenswrapper[7518]: I0319 09:19:42.220562 7518 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 09:19:42.220916 master-0 kubenswrapper[7518]: I0319 09:19:42.220847 7518 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 09:19:42.221258 master-0 kubenswrapper[7518]: I0319 09:19:42.220905 7518 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 09:19:42.221359 master-0 kubenswrapper[7518]: I0319 09:19:42.221294 7518 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 09:19:42.221359 master-0 kubenswrapper[7518]: I0319 09:19:42.221313 7518 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 09:19:42.221359 master-0 kubenswrapper[7518]: I0319 09:19:42.221331 7518 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:19:42.221535 master-0 kubenswrapper[7518]: I0319 09:19:42.221373 7518 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:19:42.221626 master-0 kubenswrapper[7518]: I0319 09:19:42.221593 7518 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:19:42.221748 master-0 kubenswrapper[7518]: I0319 09:19:42.221715 7518 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 09:19:42.221835 master-0 kubenswrapper[7518]: I0319 09:19:42.221812 7518 kubelet.go:418] "Attempting to sync node with API server" Mar 19 09:19:42.221835 master-0 kubenswrapper[7518]: I0319 09:19:42.221832 7518 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 09:19:42.221909 master-0 kubenswrapper[7518]: I0319 09:19:42.221856 7518 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 09:19:42.221909 master-0 kubenswrapper[7518]: I0319 09:19:42.221873 7518 kubelet.go:324] "Adding apiserver pod source" Mar 19 09:19:42.221909 master-0 
kubenswrapper[7518]: I0319 09:19:42.221899 7518 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 09:19:42.223870 master-0 kubenswrapper[7518]: I0319 09:19:42.223784 7518 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 09:19:42.224206 master-0 kubenswrapper[7518]: I0319 09:19:42.224169 7518 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.224907 7518 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225072 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225093 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225100 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225108 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225115 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225124 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225132 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225139 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 09:19:42.225547 master-0 
kubenswrapper[7518]: I0319 09:19:42.225153 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225161 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225172 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225185 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 09:19:42.225547 master-0 kubenswrapper[7518]: I0319 09:19:42.225241 7518 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 09:19:42.226382 master-0 kubenswrapper[7518]: I0319 09:19:42.226336 7518 server.go:1280] "Started kubelet" Mar 19 09:19:42.227577 master-0 kubenswrapper[7518]: I0319 09:19:42.227414 7518 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 09:19:42.227862 master-0 kubenswrapper[7518]: I0319 09:19:42.227613 7518 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 09:19:42.228591 master-0 kubenswrapper[7518]: I0319 09:19:42.228534 7518 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 09:19:42.229042 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 19 09:19:42.231646 master-0 kubenswrapper[7518]: I0319 09:19:42.231605 7518 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 09:19:42.240569 master-0 kubenswrapper[7518]: I0319 09:19:42.240515 7518 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:19:42.240696 master-0 kubenswrapper[7518]: I0319 09:19:42.240661 7518 server.go:449] "Adding debug handlers to kubelet server" Mar 19 09:19:42.241339 master-0 kubenswrapper[7518]: I0319 09:19:42.241306 7518 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:19:42.243343 master-0 kubenswrapper[7518]: I0319 09:19:42.243303 7518 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 09:19:42.243422 master-0 kubenswrapper[7518]: I0319 09:19:42.243349 7518 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 09:19:42.243490 master-0 kubenswrapper[7518]: I0319 09:19:42.243430 7518 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 09:19:42.243525 master-0 kubenswrapper[7518]: I0319 09:19:42.243503 7518 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 09:19:42.243741 master-0 kubenswrapper[7518]: I0319 09:19:42.243430 7518 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 06:16:23.739613966 +0000 UTC Mar 19 09:19:42.243741 master-0 kubenswrapper[7518]: I0319 09:19:42.243707 7518 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 19 09:19:42.243840 master-0 kubenswrapper[7518]: I0319 09:19:42.243741 7518 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h56m41.495899117s for next certificate rotation Mar 19 09:19:42.245596 master-0 kubenswrapper[7518]: I0319 09:19:42.245552 7518 reflector.go:368] 
Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:19:42.245686 master-0 kubenswrapper[7518]: I0319 09:19:42.245664 7518 factory.go:55] Registering systemd factory Mar 19 09:19:42.245732 master-0 kubenswrapper[7518]: I0319 09:19:42.245690 7518 factory.go:221] Registration of the systemd container factory successfully Mar 19 09:19:42.246112 master-0 kubenswrapper[7518]: I0319 09:19:42.246083 7518 factory.go:153] Registering CRI-O factory Mar 19 09:19:42.246112 master-0 kubenswrapper[7518]: I0319 09:19:42.246100 7518 factory.go:221] Registration of the crio container factory successfully Mar 19 09:19:42.246225 master-0 kubenswrapper[7518]: I0319 09:19:42.246193 7518 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 09:19:42.246274 master-0 kubenswrapper[7518]: I0319 09:19:42.246226 7518 factory.go:103] Registering Raw factory Mar 19 09:19:42.246274 master-0 kubenswrapper[7518]: I0319 09:19:42.246243 7518 manager.go:1196] Started watching for new ooms in manager Mar 19 09:19:42.246777 master-0 kubenswrapper[7518]: I0319 09:19:42.246747 7518 manager.go:319] Starting recovery of all containers Mar 19 09:19:42.249459 master-0 kubenswrapper[7518]: I0319 09:19:42.249400 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="310d604b-fe9a-4b19-b8b5-7a1983e45e67" volumeName="kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access" seLinuxMountContext="" Mar 19 09:19:42.249459 master-0 kubenswrapper[7518]: I0319 09:19:42.249452 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle" 
seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249483 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" volumeName="kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config" seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249497 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ac42112-6a00-4c17-b230-75b565aa668f" volumeName="kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca" seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249511 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a823c8bc-09ef-46a9-a1f3-155a34b89788" volumeName="kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249524 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert" seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249538 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="083882c0-ea2f-4405-8cf1-cce5b91fe602" volumeName="kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249549 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 
09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249561 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" volumeName="kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets" seLinuxMountContext="" Mar 19 09:19:42.249563 master-0 kubenswrapper[7518]: I0319 09:19:42.249571 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249583 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86c4b0e4-3481-465d-b00f-022d2c58c183" volumeName="kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249595 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823" volumeName="kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249608 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a75049de-dcf1-4102-b339-f45d5015adea" volumeName="kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249620 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 
kubenswrapper[7518]: I0319 09:19:42.249629 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" volumeName="kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249638 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249647 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823" volumeName="kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249671 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="208939f5-8fca-4fd5-b0c6-43484b7d1e30" volumeName="kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249683 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="310d604b-fe9a-4b19-b8b5-7a1983e45e67" volumeName="kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249694 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 
kubenswrapper[7518]: I0319 09:19:42.249706 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ac42112-6a00-4c17-b230-75b565aa668f" volumeName="kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249718 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a823c8bc-09ef-46a9-a1f3-155a34b89788" volumeName="kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249732 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f2148fe-f9f6-47da-894c-b88dae360ebe" volumeName="kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249743 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" volumeName="kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249755 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249766 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" volumeName="kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 
kubenswrapper[7518]: I0319 09:19:42.249780 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" volumeName="kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.249776 master-0 kubenswrapper[7518]: I0319 09:19:42.249793 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86c4b0e4-3481-465d-b00f-022d2c58c183" volumeName="kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250214 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="872e5f8c-b014-4283-a4d2-0e2cfd29e192" volumeName="kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250332 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250358 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" volumeName="kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250395 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bec90db1-02e3-4211-8c33-f8bcc304e3a7" volumeName="kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 
09:19:42.250414 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250440 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250459 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" volumeName="kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz" seLinuxMountContext="" Mar 19 09:19:42.250512 master-0 kubenswrapper[7518]: I0319 09:19:42.250500 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides" seLinuxMountContext="" Mar 19 09:19:42.250893 master-0 kubenswrapper[7518]: I0319 09:19:42.250529 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg" seLinuxMountContext="" Mar 19 09:19:42.250893 master-0 kubenswrapper[7518]: I0319 09:19:42.250549 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" volumeName="kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk" seLinuxMountContext="" Mar 19 09:19:42.250893 master-0 kubenswrapper[7518]: 
I0319 09:19:42.250573 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="872e5f8c-b014-4283-a4d2-0e2cfd29e192" volumeName="kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config" seLinuxMountContext="" Mar 19 09:19:42.250893 master-0 kubenswrapper[7518]: I0319 09:19:42.250592 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p" seLinuxMountContext="" Mar 19 09:19:42.250893 master-0 kubenswrapper[7518]: I0319 09:19:42.250611 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bec90db1-02e3-4211-8c33-f8bcc304e3a7" volumeName="kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd" seLinuxMountContext="" Mar 19 09:19:42.251156 master-0 kubenswrapper[7518]: I0319 09:19:42.251090 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4256d841-23cb-4756-b827-f44ee6e54def" volumeName="kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr" seLinuxMountContext="" Mar 19 09:19:42.251224 master-0 kubenswrapper[7518]: I0319 09:19:42.251171 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86c4b0e4-3481-465d-b00f-022d2c58c183" volumeName="kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v" seLinuxMountContext="" Mar 19 09:19:42.251224 master-0 kubenswrapper[7518]: I0319 09:19:42.251192 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token" seLinuxMountContext="" Mar 19 09:19:42.251335 master-0 
kubenswrapper[7518]: I0319 09:19:42.251228 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config" seLinuxMountContext="" Mar 19 09:19:42.251335 master-0 kubenswrapper[7518]: I0319 09:19:42.251250 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca" seLinuxMountContext="" Mar 19 09:19:42.251335 master-0 kubenswrapper[7518]: I0319 09:19:42.251280 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" volumeName="kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8" seLinuxMountContext="" Mar 19 09:19:42.251335 master-0 kubenswrapper[7518]: I0319 09:19:42.251299 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8aa0f17a-287e-4a19-9a59-4913e7707071" volumeName="kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm" seLinuxMountContext="" Mar 19 09:19:42.251335 master-0 kubenswrapper[7518]: I0319 09:19:42.251336 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 kubenswrapper[7518]: I0319 09:19:42.251365 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a75049de-dcf1-4102-b339-f45d5015adea" volumeName="kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 
kubenswrapper[7518]: I0319 09:19:42.251382 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a823c8bc-09ef-46a9-a1f3-155a34b89788" volumeName="kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 kubenswrapper[7518]: I0319 09:19:42.251415 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 kubenswrapper[7518]: I0319 09:19:42.251446 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="083882c0-ea2f-4405-8cf1-cce5b91fe602" volumeName="kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 kubenswrapper[7518]: I0319 09:19:42.251507 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 kubenswrapper[7518]: I0319 09:19:42.251550 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ece5177b-ae15-4c33-a8d4-612ab50b2b8b" volumeName="kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m" seLinuxMountContext="" Mar 19 09:19:42.251581 master-0 kubenswrapper[7518]: I0319 09:19:42.251569 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 19 
09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251596 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b36f3b2-caf9-40ad-a3a1-e83796142f54" volumeName="kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251616 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33e92e5d-61ea-45b2-b357-ebffdaebf4af" volumeName="kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251668 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" volumeName="kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251684 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251709 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251729 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib" seLinuxMountContext="" Mar 19 
09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251753 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="083882c0-ea2f-4405-8cf1-cce5b91fe602" volumeName="kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251786 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251854 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config" seLinuxMountContext="" Mar 19 09:19:42.251888 master-0 kubenswrapper[7518]: I0319 09:19:42.251868 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.251996 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="872e5f8c-b014-4283-a4d2-0e2cfd29e192" volumeName="kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252019 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides" seLinuxMountContext="" Mar 19 
09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252103 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a75049de-dcf1-4102-b339-f45d5015adea" volumeName="kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252127 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d" volumeName="kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252148 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252173 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" volumeName="kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252187 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd" seLinuxMountContext="" Mar 19 09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252207 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client" seLinuxMountContext="" Mar 19 
09:19:42.252259 master-0 kubenswrapper[7518]: I0319 09:19:42.252253 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 09:19:42.252588 master-0 kubenswrapper[7518]: I0319 09:19:42.252387 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e09725c2-45c6-4a60-b817-6e5316d6f8e8" volumeName="kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj" seLinuxMountContext="" Mar 19 09:19:42.252588 master-0 kubenswrapper[7518]: I0319 09:19:42.252412 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33e92e5d-61ea-45b2-b357-ebffdaebf4af" volumeName="kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 09:19:42.252588 master-0 kubenswrapper[7518]: I0319 09:19:42.252430 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" volumeName="kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access" seLinuxMountContext="" Mar 19 09:19:42.252588 master-0 kubenswrapper[7518]: I0319 09:19:42.252454 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b36f3b2-caf9-40ad-a3a1-e83796142f54" volumeName="kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config" seLinuxMountContext="" Mar 19 09:19:42.252588 master-0 kubenswrapper[7518]: I0319 09:19:42.252536 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b36f3b2-caf9-40ad-a3a1-e83796142f54" volumeName="kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj" seLinuxMountContext="" Mar 19 
09:19:42.252772 master-0 kubenswrapper[7518]: I0319 09:19:42.252579 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca" seLinuxMountContext="" Mar 19 09:19:42.252941 master-0 kubenswrapper[7518]: I0319 09:19:42.252829 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides" seLinuxMountContext="" Mar 19 09:19:42.253082 master-0 kubenswrapper[7518]: I0319 09:19:42.253043 7518 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="310d604b-fe9a-4b19-b8b5-7a1983e45e67" volumeName="kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config" seLinuxMountContext="" Mar 19 09:19:42.253146 master-0 kubenswrapper[7518]: I0319 09:19:42.253083 7518 reconstruct.go:97] "Volume reconstruction finished" Mar 19 09:19:42.253146 master-0 kubenswrapper[7518]: I0319 09:19:42.253095 7518 reconciler.go:26] "Reconciler: start to sync state" Mar 19 09:19:42.312507 master-0 kubenswrapper[7518]: I0319 09:19:42.312320 7518 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 09:19:42.314463 master-0 kubenswrapper[7518]: I0319 09:19:42.314432 7518 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 09:19:42.314528 master-0 kubenswrapper[7518]: I0319 09:19:42.314495 7518 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 09:19:42.314566 master-0 kubenswrapper[7518]: I0319 09:19:42.314534 7518 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 09:19:42.314725 master-0 kubenswrapper[7518]: E0319 09:19:42.314682 7518 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 09:19:42.316636 master-0 kubenswrapper[7518]: I0319 09:19:42.316593 7518 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:19:42.320591 master-0 kubenswrapper[7518]: I0319 09:19:42.320530 7518 generic.go:334] "Generic (PLEG): container finished" podID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerID="9b28c300e3439abe307f50e88ba8ce2d925b14966bafd61f93ba6a56066cd1f7" exitCode=0 Mar 19 09:19:42.324808 master-0 kubenswrapper[7518]: I0319 09:19:42.324775 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svct_872e5f8c-b014-4283-a4d2-0e2cfd29e192/kube-multus/0.log" Mar 19 09:19:42.324867 master-0 kubenswrapper[7518]: I0319 09:19:42.324826 7518 generic.go:334] "Generic (PLEG): container finished" podID="872e5f8c-b014-4283-a4d2-0e2cfd29e192" containerID="b504737085975340ca235cec0c4c9e74b2eb5d8b9a50455476ac176eb78b4a5c" exitCode=1 Mar 19 09:19:42.326749 master-0 kubenswrapper[7518]: I0319 09:19:42.326715 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:19:42.327350 master-0 kubenswrapper[7518]: I0319 09:19:42.327307 7518 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" 
containerID="c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11" exitCode=1 Mar 19 09:19:42.327408 master-0 kubenswrapper[7518]: I0319 09:19:42.327346 7518 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="c218293403aa861a38085870e890bceedfe5394df8d5e259c54d305af3fdeae9" exitCode=0 Mar 19 09:19:42.332294 master-0 kubenswrapper[7518]: I0319 09:19:42.332094 7518 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266" exitCode=1 Mar 19 09:19:42.338804 master-0 kubenswrapper[7518]: I0319 09:19:42.338765 7518 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="a84c1f34c626f1387c9440e1656352bf22e178dc307b15faa17e2d14af155731" exitCode=0 Mar 19 09:19:42.343978 master-0 kubenswrapper[7518]: I0319 09:19:42.343941 7518 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="dee05648403cf8d6ee35acba18e21f4c87a759e5c8fc08c0570622f3df3f33e1" exitCode=0 Mar 19 09:19:42.343978 master-0 kubenswrapper[7518]: I0319 09:19:42.343966 7518 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="a5f501670eb3ea46a2e9833a8efe0358489fe82196edec8a883f420d084aeb16" exitCode=0 Mar 19 09:19:42.343978 master-0 kubenswrapper[7518]: I0319 09:19:42.343977 7518 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="b1fd1a1a09332960aaf03f0be319bfd31ad0e612d2387b20f773844856dcefe5" exitCode=0 Mar 19 09:19:42.344162 master-0 kubenswrapper[7518]: I0319 09:19:42.343988 7518 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="665177f0301e1fc60d7ae832223fecb7c16c65e7cc5cfa86a5c6a63e7efdc407" exitCode=0 Mar 19 09:19:42.344162 master-0 
kubenswrapper[7518]: I0319 09:19:42.343997 7518 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="7335f4e870393336ecca59a320d7b43e9c33ca895a7a0816d7e753f6c020f7af" exitCode=0 Mar 19 09:19:42.344162 master-0 kubenswrapper[7518]: I0319 09:19:42.344008 7518 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="a2f44163a580069fe9b4a06584e3e0baeea817a9f7b28d2b1b8dc2d50f42ba8a" exitCode=0 Mar 19 09:19:42.367775 master-0 kubenswrapper[7518]: I0319 09:19:42.367714 7518 generic.go:334] "Generic (PLEG): container finished" podID="96902651-8e2b-44c2-be80-0a8c7c28cb58" containerID="df60facd7b253794e244b5462531d7a854ab92c89e6e7a5b56683d4b99824cfc" exitCode=0 Mar 19 09:19:42.369243 master-0 kubenswrapper[7518]: I0319 09:19:42.369207 7518 generic.go:334] "Generic (PLEG): container finished" podID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerID="ddf97e1b992b687ae1658f8b5cc4c1c01ae45509b7aaa2768e80614c358636c9" exitCode=0 Mar 19 09:19:42.377483 master-0 kubenswrapper[7518]: I0319 09:19:42.377442 7518 manager.go:324] Recovery completed Mar 19 09:19:42.414814 master-0 kubenswrapper[7518]: E0319 09:19:42.414754 7518 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 09:19:42.415913 master-0 kubenswrapper[7518]: I0319 09:19:42.415885 7518 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 09:19:42.415913 master-0 kubenswrapper[7518]: I0319 09:19:42.415903 7518 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 09:19:42.416000 master-0 kubenswrapper[7518]: I0319 09:19:42.415920 7518 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:19:42.416086 master-0 kubenswrapper[7518]: I0319 09:19:42.416065 7518 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 19 09:19:42.416116 master-0 kubenswrapper[7518]: I0319 09:19:42.416080 7518 state_mem.go:96] "Updated 
CPUSet assignments" assignments={} Mar 19 09:19:42.416116 master-0 kubenswrapper[7518]: I0319 09:19:42.416099 7518 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 19 09:19:42.416116 master-0 kubenswrapper[7518]: I0319 09:19:42.416105 7518 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 19 09:19:42.416116 master-0 kubenswrapper[7518]: I0319 09:19:42.416112 7518 policy_none.go:49] "None policy: Start" Mar 19 09:19:42.417415 master-0 kubenswrapper[7518]: I0319 09:19:42.417377 7518 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 09:19:42.417478 master-0 kubenswrapper[7518]: I0319 09:19:42.417421 7518 state_mem.go:35] "Initializing new in-memory state store" Mar 19 09:19:42.417674 master-0 kubenswrapper[7518]: I0319 09:19:42.417651 7518 state_mem.go:75] "Updated machine memory state" Mar 19 09:19:42.417674 master-0 kubenswrapper[7518]: I0319 09:19:42.417669 7518 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 19 09:19:42.425711 master-0 kubenswrapper[7518]: I0319 09:19:42.425648 7518 manager.go:334] "Starting Device Plugin manager" Mar 19 09:19:42.425711 master-0 kubenswrapper[7518]: I0319 09:19:42.425682 7518 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 09:19:42.425711 master-0 kubenswrapper[7518]: I0319 09:19:42.425694 7518 server.go:79] "Starting device plugin registration server" Mar 19 09:19:42.426197 master-0 kubenswrapper[7518]: I0319 09:19:42.426071 7518 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 09:19:42.426374 master-0 kubenswrapper[7518]: I0319 09:19:42.426193 7518 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 09:19:42.426516 master-0 kubenswrapper[7518]: I0319 09:19:42.426444 7518 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" 
Mar 19 09:19:42.426572 master-0 kubenswrapper[7518]: I0319 09:19:42.426556 7518 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 09:19:42.426572 master-0 kubenswrapper[7518]: I0319 09:19:42.426566 7518 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 09:19:42.436324 master-0 kubenswrapper[7518]: I0319 09:19:42.435725 7518 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 19 09:19:42.527654 master-0 kubenswrapper[7518]: I0319 09:19:42.527587 7518 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:19:42.529322 master-0 kubenswrapper[7518]: I0319 09:19:42.529291 7518 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:19:42.529401 master-0 kubenswrapper[7518]: I0319 09:19:42.529332 7518 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:19:42.529401 master-0 kubenswrapper[7518]: I0319 09:19:42.529346 7518 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:19:42.529495 master-0 kubenswrapper[7518]: I0319 09:19:42.529404 7518 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:19:42.616021 master-0 kubenswrapper[7518]: I0319 09:19:42.615864 7518 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 09:19:42.616701 master-0 kubenswrapper[7518]: I0319 09:19:42.616673 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26"
Mar 19 09:19:42.616775 master-0 kubenswrapper[7518]: I0319 09:19:42.616709 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"e046e1ab5ed34b841248a951c60543dfca2a668c2cdbbcdc17996eec0b9a0bfb"}
Mar 19 09:19:42.616858 master-0 kubenswrapper[7518]: I0319 09:19:42.616780 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11"}
Mar 19 09:19:42.616894 master-0 kubenswrapper[7518]: I0319 09:19:42.616865 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"c218293403aa861a38085870e890bceedfe5394df8d5e259c54d305af3fdeae9"}
Mar 19 09:19:42.616894 master-0 kubenswrapper[7518]: I0319 09:19:42.616881 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931"}
Mar 19 09:19:42.616958 master-0 kubenswrapper[7518]: I0319 09:19:42.616893 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"f432083e0bbefbf0b796c955a8b8a3248de20b6a5a5f87ee1ff2f03234e367ae"}
Mar 19 09:19:42.616995 master-0 kubenswrapper[7518]: I0319 09:19:42.616968 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a7909254e1fd575ef7a679770eb6617922c50b1fbb682ef07075bcdacdc5e021"}
Mar 19 09:19:42.616995 master-0 kubenswrapper[7518]: I0319 09:19:42.616980 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266"}
Mar 19 09:19:42.617070 master-0 kubenswrapper[7518]: I0319 09:19:42.616992 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede"}
Mar 19 09:19:42.617070 master-0 kubenswrapper[7518]: I0319 09:19:42.617062 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a"}
Mar 19 09:19:42.617123 master-0 kubenswrapper[7518]: I0319 09:19:42.617080 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b"}
Mar 19 09:19:42.617123 master-0 kubenswrapper[7518]: I0319 09:19:42.617090 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"b6127c72d8c53bf1b1380c8dcc76cc1a5d87ba8b34b442688dfdbcaa98f87386"}
Mar 19 09:19:42.617189 master-0 kubenswrapper[7518]: I0319 09:19:42.617163 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"157ec68d28f9ad49e7460cf4325702e32a61a87e98a342a6b3f00e830966c9b0"}
Mar 19 09:19:42.617231 master-0 kubenswrapper[7518]: I0319 09:19:42.617189 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"7716e20f21898d48a97cdc11ca530decd4b56cabb9557337c593d6dc0a3abe47"}
Mar 19 09:19:42.617231 master-0 kubenswrapper[7518]: I0319 09:19:42.617203 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"a84c1f34c626f1387c9440e1656352bf22e178dc307b15faa17e2d14af155731"}
Mar 19 09:19:42.617231 master-0 kubenswrapper[7518]: I0319 09:19:42.617220 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334"}
Mar 19 09:19:42.617310 master-0 kubenswrapper[7518]: I0319 09:19:42.617256 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c283b976ccff2f081d129aad2281421561a14a7be4a6f3749d2de0cb2ccb0b0b"
Mar 19 09:19:42.617310 master-0 kubenswrapper[7518]: I0319 09:19:42.617265 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d"}
Mar 19 09:19:42.617310 master-0 kubenswrapper[7518]: I0319 09:19:42.617275 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15"}
Mar 19 09:19:42.617310 master-0 kubenswrapper[7518]: I0319 09:19:42.617303 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9589bbab032e262b4d7aedeb656ab180a0c26f2d3e71118ea25c48ac0d07f6bd"
Mar 19 09:19:42.642140 master-0 kubenswrapper[7518]: I0319 09:19:42.642086 7518 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 19 09:19:42.642357 master-0 kubenswrapper[7518]: I0319 09:19:42.642187 7518 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 19 09:19:42.738010 master-0 kubenswrapper[7518]: I0319 09:19:42.737938 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.738010 master-0 kubenswrapper[7518]: I0319 09:19:42.737991 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738099 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738189 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738223 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738248 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738272 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738296 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738319 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738343 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738367 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.738416 master-0 kubenswrapper[7518]: I0319 09:19:42.738398 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.738873 master-0 kubenswrapper[7518]: I0319 09:19:42.738446 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.739062 master-0 kubenswrapper[7518]: I0319 09:19:42.739021 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.739127 master-0 kubenswrapper[7518]: I0319 09:19:42.739077 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:42.739176 master-0 kubenswrapper[7518]: I0319 09:19:42.739103 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.739227 master-0 kubenswrapper[7518]: I0319 09:19:42.739172 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.840133 master-0 kubenswrapper[7518]: I0319 09:19:42.840022 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.840383 master-0 kubenswrapper[7518]: I0319 09:19:42.840075 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.840383 master-0 kubenswrapper[7518]: I0319 09:19:42.840220 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.840383 master-0 kubenswrapper[7518]: I0319 09:19:42.840251 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:42.840383 master-0 kubenswrapper[7518]: I0319 09:19:42.840271 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.840545 master-0 kubenswrapper[7518]: I0319 09:19:42.840363 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.840545 master-0 kubenswrapper[7518]: I0319 09:19:42.840424 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.840545 master-0 kubenswrapper[7518]: I0319 09:19:42.840455 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.840933 master-0 kubenswrapper[7518]: I0319 09:19:42.840647 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.840933 master-0 kubenswrapper[7518]: I0319 09:19:42.840927 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841020 master-0 kubenswrapper[7518]: I0319 09:19:42.840904 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841020 master-0 kubenswrapper[7518]: I0319 09:19:42.840883 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:42.841086 master-0 kubenswrapper[7518]: I0319 09:19:42.841028 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:42.841120 master-0 kubenswrapper[7518]: I0319 09:19:42.841100 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:42.841120 master-0 kubenswrapper[7518]: I0319 09:19:42.841098 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841200 master-0 kubenswrapper[7518]: I0319 09:19:42.841127 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841200 master-0 kubenswrapper[7518]: I0319 09:19:42.841145 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841200 master-0 kubenswrapper[7518]: I0319 09:19:42.841181 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841200 master-0 kubenswrapper[7518]: I0319 09:19:42.841179 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.841334 master-0 kubenswrapper[7518]: I0319 09:19:42.841239 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.841334 master-0 kubenswrapper[7518]: I0319 09:19:42.841267 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.841334 master-0 kubenswrapper[7518]: I0319 09:19:42.841298 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:42.841334 master-0 kubenswrapper[7518]: I0319 09:19:42.841331 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841343 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841205 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841360 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841398 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841400 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841445 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:42.841458 master-0 kubenswrapper[7518]: I0319 09:19:42.841447 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:42.841707 master-0 kubenswrapper[7518]: I0319 09:19:42.841500 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:42.841707 master-0 kubenswrapper[7518]: I0319 09:19:42.841523 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:42.841707 master-0 kubenswrapper[7518]: I0319 09:19:42.841419 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:42.841707 master-0 kubenswrapper[7518]: I0319 09:19:42.841562 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:43.065542 master-0 kubenswrapper[7518]: E0319 09:19:43.064817 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:43.065542 master-0 kubenswrapper[7518]: E0319 09:19:43.065363 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 09:19:43.066052 master-0 kubenswrapper[7518]: E0319 09:19:43.065982 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:43.066133 master-0 kubenswrapper[7518]: W0319 09:19:43.066078 7518 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 19 09:19:43.066194 master-0 kubenswrapper[7518]: E0319 09:19:43.066178 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:43.066373 master-0 kubenswrapper[7518]: E0319 09:19:43.066338 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:19:43.223324 master-0 kubenswrapper[7518]: I0319 09:19:43.223210 7518 apiserver.go:52] "Watching apiserver"
Mar 19 09:19:43.234331 master-0 kubenswrapper[7518]: I0319 09:19:43.234229 7518 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 09:19:43.235056 master-0 kubenswrapper[7518]: I0319 09:19:43.234991 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb","openshift-dns-operator/dns-operator-9c5679d8f-fdxtp","openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd","openshift-ovn-kubernetes/ovnkube-node-fwjzr","assisted-installer/assisted-installer-controller-gn85g","kube-system/bootstrap-kube-controller-manager-master-0","openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2","openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz","openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv","kube-system/bootstrap-kube-scheduler-master-0","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6","openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc","openshift-network-operator/iptables-alerter-2s58d","openshift-network-operator/network-operator-7bd846bfc4-jxvxl","openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq","openshift-etcd/etcd-master-0-master-0","openshift-network-node-identity/network-node-identity-kqb2h","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm","openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498","openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869","openshift-multus/multus-8svct","openshift-network-diagnostics/network-check-target-95w9b","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw","openshift-multus/multus-additional-cni-plugins-tjzdb","openshift-multus/network-metrics-daemon-p76jz","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/marketplace-operator-89ccd998f-6qck2"]
Mar 19 09:19:43.235260 master-0 kubenswrapper[7518]: I0319 09:19:43.235228 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-gn85g"
Mar 19 09:19:43.235377 master-0 kubenswrapper[7518]: I0319 09:19:43.235341 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:43.235552 master-0 kubenswrapper[7518]: I0319 09:19:43.235518 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:43.236133 master-0 kubenswrapper[7518]: I0319 09:19:43.236095 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.236219 master-0 kubenswrapper[7518]: I0319 09:19:43.236179 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:43.236359 master-0 kubenswrapper[7518]: I0319 09:19:43.236322 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:43.236981 master-0 kubenswrapper[7518]: I0319 09:19:43.236947 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:43.237188 master-0 kubenswrapper[7518]: I0319 09:19:43.237155 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:43.237249 master-0 kubenswrapper[7518]: I0319 09:19:43.237227 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:43.237916 master-0 kubenswrapper[7518]: I0319 09:19:43.237870 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:43.238675 master-0 kubenswrapper[7518]: I0319 09:19:43.238636 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.240155 master-0 kubenswrapper[7518]: I0319 09:19:43.240089 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:43.240288 master-0 kubenswrapper[7518]: I0319 09:19:43.240248 7518 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:43.241210 master-0 kubenswrapper[7518]: I0319 09:19:43.241175 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:19:43.241941 master-0 kubenswrapper[7518]: I0319 09:19:43.241900 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:19:43.242127 master-0 kubenswrapper[7518]: I0319 09:19:43.241904 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:19:43.242187 master-0 kubenswrapper[7518]: I0319 09:19:43.242096 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.242565 master-0 kubenswrapper[7518]: I0319 09:19:43.242184 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:19:43.245681 master-0 kubenswrapper[7518]: I0319 09:19:43.245583 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 09:19:43.246345 master-0 kubenswrapper[7518]: I0319 09:19:43.246205 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:19:43.247665 master-0 kubenswrapper[7518]: I0319 09:19:43.246461 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:19:43.247665 master-0 kubenswrapper[7518]: I0319 09:19:43.246713 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.247665 master-0 kubenswrapper[7518]: I0319 09:19:43.247176 7518 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:19:43.247821 master-0 kubenswrapper[7518]: I0319 09:19:43.247791 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.250172 master-0 kubenswrapper[7518]: I0319 09:19:43.248071 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:19:43.250172 master-0 kubenswrapper[7518]: I0319 09:19:43.250154 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:19:43.250303 master-0 kubenswrapper[7518]: I0319 09:19:43.250293 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:19:43.250367 master-0 kubenswrapper[7518]: I0319 09:19:43.250330 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 09:19:43.250534 master-0 kubenswrapper[7518]: I0319 09:19:43.250436 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.251522 master-0 kubenswrapper[7518]: I0319 09:19:43.250581 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:19:43.251522 master-0 kubenswrapper[7518]: I0319 09:19:43.250583 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.251522 master-0 kubenswrapper[7518]: I0319 09:19:43.250650 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:19:43.251522 master-0 kubenswrapper[7518]: I0319 
09:19:43.250780 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:19:43.251522 master-0 kubenswrapper[7518]: I0319 09:19:43.250787 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 09:19:43.255196 master-0 kubenswrapper[7518]: I0319 09:19:43.254902 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.255311 master-0 kubenswrapper[7518]: I0319 09:19:43.255276 7518 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 19 09:19:43.255929 master-0 kubenswrapper[7518]: I0319 09:19:43.255795 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:19:43.255929 master-0 kubenswrapper[7518]: I0319 09:19:43.255868 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 09:19:43.256204 master-0 kubenswrapper[7518]: I0319 09:19:43.255998 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 09:19:43.256204 master-0 kubenswrapper[7518]: I0319 09:19:43.256074 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:19:43.256204 master-0 kubenswrapper[7518]: I0319 09:19:43.256102 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 09:19:43.256896 master-0 kubenswrapper[7518]: I0319 09:19:43.256353 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.256896 master-0 kubenswrapper[7518]: I0319 09:19:43.256409 7518 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:19:43.256896 master-0 kubenswrapper[7518]: I0319 09:19:43.256593 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:19:43.256896 master-0 kubenswrapper[7518]: I0319 09:19:43.256632 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:19:43.256896 master-0 kubenswrapper[7518]: I0319 09:19:43.256699 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 09:19:43.256896 master-0 kubenswrapper[7518]: I0319 09:19:43.256892 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257037 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257128 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257199 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257305 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257355 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257426 7518 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:19:43.257431 master-0 kubenswrapper[7518]: I0319 09:19:43.257435 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:19:43.257710 master-0 kubenswrapper[7518]: I0319 09:19:43.257590 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 09:19:43.257710 master-0 kubenswrapper[7518]: I0319 09:19:43.257639 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:19:43.257796 master-0 kubenswrapper[7518]: I0319 09:19:43.257753 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:19:43.257861 master-0 kubenswrapper[7518]: I0319 09:19:43.257839 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:19:43.258239 master-0 kubenswrapper[7518]: I0319 09:19:43.257921 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:19:43.258239 master-0 kubenswrapper[7518]: I0319 09:19:43.258073 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:19:43.258239 master-0 kubenswrapper[7518]: I0319 09:19:43.258125 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:19:43.258239 master-0 kubenswrapper[7518]: I0319 09:19:43.258179 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 09:19:43.258239 master-0 kubenswrapper[7518]: I0319 09:19:43.258074 7518 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:19:43.258502 master-0 kubenswrapper[7518]: I0319 09:19:43.258261 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:19:43.258502 master-0 kubenswrapper[7518]: I0319 09:19:43.257149 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 09:19:43.258502 master-0 kubenswrapper[7518]: I0319 09:19:43.258356 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:19:43.258502 master-0 kubenswrapper[7518]: I0319 09:19:43.258387 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:19:43.258502 master-0 kubenswrapper[7518]: I0319 09:19:43.258500 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:19:43.258711 master-0 kubenswrapper[7518]: I0319 09:19:43.258686 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.258689 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.258833 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.258973 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 
09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.259017 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.259079 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.259549 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.259591 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.259741 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.259967 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.260724 master-0 kubenswrapper[7518]: I0319 09:19:43.260178 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:19:43.261136 master-0 kubenswrapper[7518]: I0319 09:19:43.261060 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 09:19:43.261382 master-0 kubenswrapper[7518]: I0319 09:19:43.261232 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:19:43.261382 master-0 kubenswrapper[7518]: I0319 09:19:43.261250 7518 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 09:19:43.261659 master-0 kubenswrapper[7518]: I0319 09:19:43.261633 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 09:19:43.262517 master-0 kubenswrapper[7518]: I0319 09:19:43.262461 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.262703 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263012 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263022 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263183 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263253 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263376 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263494 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263515 7518 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263615 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263620 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263622 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263641 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263649 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.263627 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.264026 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.264043 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.264064 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:19:43.264059 master-0 kubenswrapper[7518]: I0319 09:19:43.264067 7518 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:19:43.264819 master-0 kubenswrapper[7518]: I0319 09:19:43.264602 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 09:19:43.264819 master-0 kubenswrapper[7518]: I0319 09:19:43.264705 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:19:43.264819 master-0 kubenswrapper[7518]: I0319 09:19:43.264774 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.264936 master-0 kubenswrapper[7518]: I0319 09:19:43.264874 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:19:43.265262 master-0 kubenswrapper[7518]: I0319 09:19:43.265220 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:19:43.265356 master-0 kubenswrapper[7518]: I0319 09:19:43.265329 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:19:43.265598 master-0 kubenswrapper[7518]: I0319 09:19:43.265553 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:19:43.273581 master-0 kubenswrapper[7518]: I0319 09:19:43.271493 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:19:43.274459 master-0 kubenswrapper[7518]: I0319 09:19:43.274415 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:19:43.275389 master-0 kubenswrapper[7518]: I0319 09:19:43.275300 
7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 09:19:43.275933 master-0 kubenswrapper[7518]: I0319 09:19:43.275884 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:19:43.277018 master-0 kubenswrapper[7518]: I0319 09:19:43.276974 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:19:43.291800 master-0 kubenswrapper[7518]: I0319 09:19:43.288006 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:19:43.291800 master-0 kubenswrapper[7518]: I0319 09:19:43.288018 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:19:43.305895 master-0 kubenswrapper[7518]: I0319 09:19:43.305842 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:19:43.326531 master-0 kubenswrapper[7518]: I0319 09:19:43.326373 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 09:19:43.345939 master-0 kubenswrapper[7518]: I0319 09:19:43.345862 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:19:43.348835 master-0 kubenswrapper[7518]: I0319 09:19:43.348794 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:43.348892 master-0 kubenswrapper[7518]: I0319 09:19:43.348835 7518 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.348892 master-0 kubenswrapper[7518]: I0319 09:19:43.348857 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:43.348983 master-0 kubenswrapper[7518]: I0319 09:19:43.348889 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:43.349240 master-0 kubenswrapper[7518]: I0319 09:19:43.349192 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:43.349276 master-0 kubenswrapper[7518]: I0319 09:19:43.349237 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:43.349276 master-0 kubenswrapper[7518]: I0319 09:19:43.349258 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:19:43.349335 master-0 kubenswrapper[7518]: I0319 09:19:43.349279 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.349335 master-0 kubenswrapper[7518]: I0319 09:19:43.349297 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.349335 master-0 kubenswrapper[7518]: I0319 09:19:43.349315 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.349457 master-0 kubenswrapper[7518]: I0319 09:19:43.349432 7518 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.349516 master-0 kubenswrapper[7518]: I0319 09:19:43.349453 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.349516 master-0 kubenswrapper[7518]: I0319 09:19:43.349501 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:43.349572 master-0 kubenswrapper[7518]: I0319 09:19:43.349523 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.349572 master-0 kubenswrapper[7518]: I0319 09:19:43.349544 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.349572 master-0 kubenswrapper[7518]: I0319 09:19:43.349562 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.349661 master-0 kubenswrapper[7518]: I0319 09:19:43.349572 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.349661 master-0 kubenswrapper[7518]: I0319 09:19:43.349581 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.349718 master-0 kubenswrapper[7518]: I0319 09:19:43.349665 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.349755 master-0 kubenswrapper[7518]: I0319 09:19:43.349692 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.349801 master-0 kubenswrapper[7518]: I0319 09:19:43.349777 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:43.349880 master-0 kubenswrapper[7518]: I0319 09:19:43.349824 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.349919 master-0 kubenswrapper[7518]: I0319 09:19:43.349878 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.349919 master-0 kubenswrapper[7518]: I0319 09:19:43.349911 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.349989 master-0 kubenswrapper[7518]: I0319 09:19:43.349946 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:43.349989 master-0 kubenswrapper[7518]: I0319 09:19:43.349913 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.349989 master-0 kubenswrapper[7518]: I0319 09:19:43.349975 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.350083 master-0 kubenswrapper[7518]: I0319 09:19:43.350030 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:43.350083 master-0 kubenswrapper[7518]: I0319 09:19:43.350059 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.350083 master-0 kubenswrapper[7518]: I0319 09:19:43.350065 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.350171 master-0 kubenswrapper[7518]: I0319 09:19:43.350067 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.350171 master-0 kubenswrapper[7518]: I0319 09:19:43.350109 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.350171 master-0 kubenswrapper[7518]: I0319 09:19:43.350130 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:43.350171 master-0 kubenswrapper[7518]: I0319 09:19:43.350147 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350169 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350194 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350210 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350228 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350229 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350247 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:43.350317 master-0 kubenswrapper[7518]: I0319 09:19:43.350300 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350319 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350340 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350362 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350378 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350398 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350414 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350435 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350440 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350455 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:43.350621 master-0 kubenswrapper[7518]: I0319 09:19:43.350460 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350490 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350675 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350700 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350779 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350730 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350831 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350851 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350862 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:43.350918 master-0 kubenswrapper[7518]: I0319 09:19:43.350898 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.350934 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.350951 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.351057 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.351089 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.351116 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.351143 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.351160 master-0 kubenswrapper[7518]: I0319 09:19:43.351149 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:43.351368 master-0 kubenswrapper[7518]: I0319 09:19:43.351116 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.351368 master-0 kubenswrapper[7518]: I0319 09:19:43.351244 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5cd\" (UniqueName: \"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:43.351368 master-0 kubenswrapper[7518]: I0319 09:19:43.351258 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:19:43.351520 master-0 kubenswrapper[7518]: I0319 09:19:43.351427 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:43.351520 master-0 kubenswrapper[7518]: I0319 09:19:43.351425 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.351793 master-0 kubenswrapper[7518]: I0319 09:19:43.351529 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:43.351793 master-0 kubenswrapper[7518]: I0319 09:19:43.351577 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.351793 master-0 kubenswrapper[7518]: I0319 09:19:43.351632 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.351793 master-0 kubenswrapper[7518]: I0319 09:19:43.351744 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:43.351793 master-0 kubenswrapper[7518]: I0319 09:19:43.351757 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:19:43.351793 master-0 kubenswrapper[7518]: I0319 09:19:43.351789 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:43.352071 master-0 kubenswrapper[7518]: I0319 09:19:43.351791 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:43.352071 master-0 kubenswrapper[7518]: I0319 09:19:43.351840 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:43.352071 master-0 kubenswrapper[7518]: I0319 09:19:43.351875 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.352071 master-0 kubenswrapper[7518]: I0319 09:19:43.351933 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.352071 master-0 kubenswrapper[7518]: I0319 09:19:43.351989 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.352252 master-0 kubenswrapper[7518]: I0319 09:19:43.352118 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:19:43.352252 master-0 kubenswrapper[7518]: I0319 09:19:43.352146 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:43.352252 master-0 kubenswrapper[7518]: I0319 09:19:43.352199 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:43.352351 master-0 kubenswrapper[7518]: I0319 09:19:43.352267 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:43.352381 master-0 kubenswrapper[7518]: I0319 09:19:43.352350 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:43.352428 master-0 kubenswrapper[7518]: I0319 09:19:43.352398 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.352510 master-0 kubenswrapper[7518]: I0319 09:19:43.352423 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:19:43.352510 master-0 kubenswrapper[7518]: I0319 09:19:43.352431 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.352510 master-0 kubenswrapper[7518]: I0319 09:19:43.352488 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:43.352618 master-0 kubenswrapper[7518]: I0319 09:19:43.352528 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.352618 master-0 kubenswrapper[7518]: I0319 09:19:43.352561 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:19:43.352618 master-0 kubenswrapper[7518]: I0319 09:19:43.352588 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:43.352707 master-0 kubenswrapper[7518]: I0319 09:19:43.352622 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:43.352707 master-0 kubenswrapper[7518]: I0319 09:19:43.352621 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:43.352707 master-0 kubenswrapper[7518]: I0319 09:19:43.352642 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.352707 master-0 kubenswrapper[7518]: I0319 09:19:43.352667 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.352707 master-0 kubenswrapper[7518]: I0319 09:19:43.352697 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 09:19:43.352720 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 09:19:43.352744 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 09:19:43.352767 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 09:19:43.352792 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 09:19:43.352813 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 09:19:43.352836 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.352855 master-0 kubenswrapper[7518]: I0319 
09:19:43.352843 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.352862 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.352902 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.352969 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.352976 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod 
\"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.352992 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.353016 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.353060 master-0 kubenswrapper[7518]: I0319 09:19:43.353022 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353041 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353022 7518 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353118 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353144 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353164 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353184 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " 
pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353204 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353166 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353229 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353254 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.353284 master-0 kubenswrapper[7518]: I0319 09:19:43.353272 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod 
\"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353293 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353326 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353404 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353481 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353517 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353538 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353564 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353616 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353629 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.353648 master-0 kubenswrapper[7518]: I0319 09:19:43.353646 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.353661 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.353674 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.353710 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " 
pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.353736 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.353782 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.353950 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:43.354001 master-0 kubenswrapper[7518]: I0319 09:19:43.354003 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354026 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354055 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354085 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354115 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354143 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354170 7518 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lktk8\" (UniqueName: \"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:43.354204 master-0 kubenswrapper[7518]: I0319 09:19:43.354202 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.354544 master-0 kubenswrapper[7518]: I0319 09:19:43.354232 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:43.354544 master-0 kubenswrapper[7518]: I0319 09:19:43.354344 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:43.354544 master-0 kubenswrapper[7518]: I0319 09:19:43.354355 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" 
(UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:43.354544 master-0 kubenswrapper[7518]: I0319 09:19:43.354375 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.354544 master-0 kubenswrapper[7518]: I0319 09:19:43.354501 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354564 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354598 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354618 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354639 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354660 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354694 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:43.354734 master-0 kubenswrapper[7518]: I0319 09:19:43.354718 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.354944 master-0 kubenswrapper[7518]: I0319 09:19:43.354824 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:19:43.354944 master-0 kubenswrapper[7518]: I0319 09:19:43.354870 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:19:43.354944 master-0 kubenswrapper[7518]: I0319 09:19:43.354877 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.355037 master-0 kubenswrapper[7518]: I0319 09:19:43.354980 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:43.355083 master-0 kubenswrapper[7518]: I0319 09:19:43.355058 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod 
\"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:43.355142 master-0 kubenswrapper[7518]: I0319 09:19:43.355092 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:43.355206 master-0 kubenswrapper[7518]: I0319 09:19:43.355154 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:43.355237 master-0 kubenswrapper[7518]: I0319 09:19:43.355223 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.355352 master-0 kubenswrapper[7518]: I0319 09:19:43.355287 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:43.355352 master-0 kubenswrapper[7518]: I0319 09:19:43.355291 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:43.355352 master-0 kubenswrapper[7518]: I0319 09:19:43.355310 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:19:43.355479 master-0 kubenswrapper[7518]: I0319 09:19:43.355351 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.355479 master-0 kubenswrapper[7518]: I0319 09:19:43.355406 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.355479 master-0 kubenswrapper[7518]: I0319 09:19:43.355444 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.355586 master-0 kubenswrapper[7518]: I0319 09:19:43.355504 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.355586 master-0 kubenswrapper[7518]: I0319 09:19:43.355521 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:43.355586 master-0 kubenswrapper[7518]: I0319 09:19:43.355538 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.355675 master-0 kubenswrapper[7518]: I0319 09:19:43.355598 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:43.355737 master-0 kubenswrapper[7518]: I0319 09:19:43.355712 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:43.355782 master-0 kubenswrapper[7518]: I0319 09:19:43.355556 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:19:43.355814 master-0 kubenswrapper[7518]: I0319 09:19:43.355793 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.355847 master-0 kubenswrapper[7518]: I0319 09:19:43.355806 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:43.355904 master-0 kubenswrapper[7518]: I0319 09:19:43.355875 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.456767 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.456850 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.456881 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457006 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457049 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457142 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457181 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457107 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457195 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.957172209 +0000 UTC m=+1.839755648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457301 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.957258562 +0000 UTC m=+1.839841821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457356 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457409 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457618 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457638 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457657 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457677 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457686 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.957674662 +0000 UTC m=+1.840258111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457720 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457742 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457723 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457774 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457775 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457820 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457851 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457878 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.457890 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.957880547 +0000 UTC m=+1.840463966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457858 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457901 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457915 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457941 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457959 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.457984 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458002 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458011 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458036 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458064 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458067 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458066 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458093 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.958084583 +0000 UTC m=+1.840667942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458085 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458110 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458151 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458162 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458190 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.958180706 +0000 UTC m=+1.840763965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458255 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458276 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458302 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458311 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458306 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458323 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458361 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458367 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458387 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.958380401 +0000 UTC m=+1.840963660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458404 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458405 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458429 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458494 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458496 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458542 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458578 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458600 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458627 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458636 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.958628807 +0000 UTC m=+1.841212066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458662 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458672 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458706 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458714 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458731 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458775 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458814 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.458821 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.958796941 +0000 UTC m=+1.841380200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458927 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458944 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.458974 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.459007 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: E0319 09:19:43.459045 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.959034837 +0000 UTC m=+1.841618276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459010 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459063 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459094 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459126 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID:
\"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459156 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459188 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459192 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.798250 master-0 kubenswrapper[7518]: I0319 09:19:43.459221 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459130 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: 
\"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459096 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459225 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459276 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459278 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: E0319 09:19:43.459295 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459303 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459311 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: E0319 09:19:43.459328 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.959320254 +0000 UTC m=+1.841903593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459333 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459346 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459373 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459408 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459450 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459503 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459512 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459542 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459543 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459590 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: E0319 09:19:43.459605 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: E0319 09:19:43.459629 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.959621973 +0000 UTC m=+1.842205232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459684 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459713 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459731 7518 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459818 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: E0319 09:19:43.459844 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459856 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: E0319 09:19:43.459890 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:43.959877209 +0000 UTC m=+1.842460668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459925 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:43.802192 master-0 kubenswrapper[7518]: I0319 09:19:43.459928 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:19:43.966130 master-0 kubenswrapper[7518]: I0319 09:19:43.966067 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:43.966314 master-0 kubenswrapper[7518]: I0319 09:19:43.966152 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 
09:19:43.966380 master-0 kubenswrapper[7518]: E0319 09:19:43.966307 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:43.966460 master-0 kubenswrapper[7518]: E0319 09:19:43.966307 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:43.966538 master-0 kubenswrapper[7518]: E0319 09:19:43.966487 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.966409923 +0000 UTC m=+2.848993372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:43.966619 master-0 kubenswrapper[7518]: E0319 09:19:43.966580 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.966565476 +0000 UTC m=+2.849148735 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:43.966619 master-0 kubenswrapper[7518]: I0319 09:19:43.966582 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:43.966691 master-0 kubenswrapper[7518]: E0319 09:19:43.966639 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:43.966691 master-0 kubenswrapper[7518]: E0319 09:19:43.966674 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.966664089 +0000 UTC m=+2.849247538 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:19:43.966691 master-0 kubenswrapper[7518]: I0319 09:19:43.966638 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:43.966779 master-0 kubenswrapper[7518]: E0319 09:19:43.966696 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:43.966779 master-0 kubenswrapper[7518]: E0319 09:19:43.966729 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.96671891 +0000 UTC m=+2.849302169 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:43.966836 master-0 kubenswrapper[7518]: I0319 09:19:43.966795 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:43.966869 master-0 kubenswrapper[7518]: I0319 09:19:43.966850 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:43.966951 master-0 kubenswrapper[7518]: I0319 09:19:43.966914 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:43.967001 master-0 kubenswrapper[7518]: E0319 09:19:43.966924 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:19:43.967035 master-0 kubenswrapper[7518]: E0319 09:19:43.967017 7518 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967007847 +0000 UTC m=+2.849591107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found Mar 19 09:19:43.967035 master-0 kubenswrapper[7518]: E0319 09:19:43.967024 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:43.967035 master-0 kubenswrapper[7518]: E0319 09:19:43.966970 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:43.967140 master-0 kubenswrapper[7518]: E0319 09:19:43.967059 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967047788 +0000 UTC m=+2.849631248 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:43.967140 master-0 kubenswrapper[7518]: E0319 09:19:43.967075 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:44.967068309 +0000 UTC m=+2.849651788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:43.967140 master-0 kubenswrapper[7518]: I0319 09:19:43.967095 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:43.967267 master-0 kubenswrapper[7518]: I0319 09:19:43.967147 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:43.967267 master-0 kubenswrapper[7518]: I0319 09:19:43.967202 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:43.967267 master-0 kubenswrapper[7518]: I0319 09:19:43.967257 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:43.967365 master-0 kubenswrapper[7518]: I0319 09:19:43.967343 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:43.967443 master-0 kubenswrapper[7518]: I0319 09:19:43.967418 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:43.967585 master-0 kubenswrapper[7518]: E0319 09:19:43.967543 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:43.967585 master-0 kubenswrapper[7518]: E0319 09:19:43.967574 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967565091 +0000 UTC m=+2.850148550 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:43.967655 master-0 kubenswrapper[7518]: E0319 09:19:43.967621 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:43.967655 master-0 kubenswrapper[7518]: E0319 09:19:43.967647 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967639333 +0000 UTC m=+2.850222592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:43.967751 master-0 kubenswrapper[7518]: E0319 09:19:43.967701 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:43.967751 master-0 kubenswrapper[7518]: E0319 09:19:43.967702 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:43.967751 master-0 kubenswrapper[7518]: E0319 09:19:43.967727 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. 
No retries permitted until 2026-03-19 09:19:44.967719946 +0000 UTC m=+2.850303205 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found Mar 19 09:19:43.967751 master-0 kubenswrapper[7518]: E0319 09:19:43.967744 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967733947 +0000 UTC m=+2.850317206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:43.967901 master-0 kubenswrapper[7518]: E0319 09:19:43.967809 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:43.967901 master-0 kubenswrapper[7518]: E0319 09:19:43.967886 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:43.967969 master-0 kubenswrapper[7518]: E0319 09:19:43.967957 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967925421 +0000 UTC m=+2.850508840 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:43.968016 master-0 kubenswrapper[7518]: E0319 09:19:43.967992 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:44.967980183 +0000 UTC m=+2.850563662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:44.383230 master-0 kubenswrapper[7518]: I0319 09:19:44.383175 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:19:44.984458 master-0 kubenswrapper[7518]: I0319 09:19:44.984393 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:44.984873 master-0 kubenswrapper[7518]: E0319 09:19:44.984842 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:44.985088 master-0 kubenswrapper[7518]: I0319 09:19:44.985025 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:44.985173 master-0 kubenswrapper[7518]: E0319 09:19:44.985113 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:44.985173 master-0 kubenswrapper[7518]: I0319 09:19:44.985149 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:44.985173 master-0 kubenswrapper[7518]: E0319 09:19:44.985171 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.985153704 +0000 UTC m=+4.867736963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:44.985274 master-0 kubenswrapper[7518]: E0319 09:19:44.985222 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.985209136 +0000 UTC m=+4.867792465 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:44.985330 master-0 kubenswrapper[7518]: E0319 09:19:44.985303 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:44.985410 master-0 kubenswrapper[7518]: E0319 09:19:44.985384 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.98536206 +0000 UTC m=+4.867945349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:44.985494 master-0 kubenswrapper[7518]: I0319 09:19:44.985446 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:44.985545 master-0 kubenswrapper[7518]: I0319 09:19:44.985518 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: 
\"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:44.985578 master-0 kubenswrapper[7518]: E0319 09:19:44.985557 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:44.985609 master-0 kubenswrapper[7518]: E0319 09:19:44.985585 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.985577225 +0000 UTC m=+4.868160484 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:19:44.985680 master-0 kubenswrapper[7518]: E0319 09:19:44.985649 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:44.985752 master-0 kubenswrapper[7518]: E0319 09:19:44.985729 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.985708059 +0000 UTC m=+4.868291358 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:44.985824 master-0 kubenswrapper[7518]: I0319 09:19:44.985794 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:44.985948 master-0 kubenswrapper[7518]: E0319 09:19:44.985933 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:19:44.986045 master-0 kubenswrapper[7518]: E0319 09:19:44.986035 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.986012037 +0000 UTC m=+4.868595366 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found Mar 19 09:19:44.986183 master-0 kubenswrapper[7518]: I0319 09:19:44.986162 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:44.986308 master-0 kubenswrapper[7518]: E0319 09:19:44.986283 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:44.986350 master-0 kubenswrapper[7518]: I0319 09:19:44.986282 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:44.986399 master-0 kubenswrapper[7518]: E0319 09:19:44.986338 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.986327645 +0000 UTC m=+4.868910984 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:44.986535 master-0 kubenswrapper[7518]: E0319 09:19:44.986519 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:44.986642 master-0 kubenswrapper[7518]: E0319 09:19:44.986632 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.986623052 +0000 UTC m=+4.869206391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:44.986700 master-0 kubenswrapper[7518]: I0319 09:19:44.986636 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:44.986815 master-0 kubenswrapper[7518]: I0319 09:19:44.986799 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod 
\"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:44.986894 master-0 kubenswrapper[7518]: E0319 09:19:44.986730 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:44.986969 master-0 kubenswrapper[7518]: E0319 09:19:44.986942 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.98691881 +0000 UTC m=+4.869502109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:44.987025 master-0 kubenswrapper[7518]: I0319 09:19:44.986993 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:44.987106 master-0 kubenswrapper[7518]: I0319 09:19:44.987073 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:44.987156 master-0 kubenswrapper[7518]: E0319 09:19:44.987116 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:44.987217 master-0 kubenswrapper[7518]: E0319 09:19:44.987204 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:44.987305 master-0 kubenswrapper[7518]: E0319 09:19:44.987206 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.987180977 +0000 UTC m=+4.869764276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:44.987360 master-0 kubenswrapper[7518]: E0319 09:19:44.987218 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:44.987396 master-0 kubenswrapper[7518]: I0319 09:19:44.987360 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:44.987427 master-0 
kubenswrapper[7518]: E0319 09:19:44.987392 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.987372522 +0000 UTC m=+4.869955811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:44.987462 master-0 kubenswrapper[7518]: E0319 09:19:44.987428 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.987416613 +0000 UTC m=+4.869999902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:44.987462 master-0 kubenswrapper[7518]: E0319 09:19:44.987458 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:44.987563 master-0 kubenswrapper[7518]: E0319 09:19:44.987511 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:46.987499755 +0000 UTC m=+4.870083014 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found Mar 19 09:19:46.410830 master-0 kubenswrapper[7518]: E0319 09:19:46.410769 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:19:46.412905 master-0 kubenswrapper[7518]: I0319 09:19:46.412875 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5cd\" (UniqueName: \"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:19:46.413071 master-0 kubenswrapper[7518]: E0319 09:19:46.413045 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:19:46.413327 master-0 kubenswrapper[7518]: E0319 09:19:46.413303 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:19:46.416918 master-0 kubenswrapper[7518]: I0319 09:19:46.416892 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:19:46.417986 master-0 kubenswrapper[7518]: I0319 
09:19:46.417959 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:19:46.418601 master-0 kubenswrapper[7518]: I0319 09:19:46.418575 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:19:46.422178 master-0 kubenswrapper[7518]: I0319 09:19:46.422149 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:46.426103 master-0 kubenswrapper[7518]: I0319 09:19:46.426060 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:46.429224 master-0 kubenswrapper[7518]: I0319 09:19:46.429192 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod 
\"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:46.432553 master-0 kubenswrapper[7518]: I0319 09:19:46.432524 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:19:46.434860 master-0 kubenswrapper[7518]: I0319 09:19:46.434834 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" Mar 19 09:19:46.436079 master-0 kubenswrapper[7518]: I0319 09:19:46.436051 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:46.436650 master-0 kubenswrapper[7518]: I0319 09:19:46.436626 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:19:46.437239 master-0 
kubenswrapper[7518]: I0319 09:19:46.437200 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:46.441064 master-0 kubenswrapper[7518]: E0319 09:19:46.441033 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:19:46.441634 master-0 kubenswrapper[7518]: I0319 09:19:46.441598 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:19:46.441675 master-0 kubenswrapper[7518]: I0319 09:19:46.441605 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:19:46.441714 master-0 kubenswrapper[7518]: I0319 09:19:46.441609 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " 
pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:19:46.441827 master-0 kubenswrapper[7518]: I0319 09:19:46.441769 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:46.441945 master-0 kubenswrapper[7518]: I0319 09:19:46.441912 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:19:46.443265 master-0 kubenswrapper[7518]: I0319 09:19:46.443235 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:19:46.443439 master-0 kubenswrapper[7518]: I0319 09:19:46.443404 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:19:46.443503 master-0 kubenswrapper[7518]: I0319 09:19:46.443460 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:19:46.443618 master-0 kubenswrapper[7518]: I0319 09:19:46.443593 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:46.444863 master-0 kubenswrapper[7518]: I0319 09:19:46.444835 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lktk8\" (UniqueName: \"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:46.444913 master-0 kubenswrapper[7518]: I0319 09:19:46.444832 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:19:46.447039 master-0 kubenswrapper[7518]: I0319 09:19:46.447011 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:46.447488 master-0 kubenswrapper[7518]: I0319 09:19:46.447444 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:46.448605 master-0 kubenswrapper[7518]: I0319 09:19:46.448583 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"
Mar 19 09:19:46.450962 master-0 kubenswrapper[7518]: I0319 09:19:46.450937 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:46.451783 master-0 kubenswrapper[7518]: I0319 09:19:46.451754 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:19:46.453777 master-0 kubenswrapper[7518]: I0319 09:19:46.453753 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:19:46.604010 master-0 kubenswrapper[7518]: I0319 09:19:46.603958 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:46.604144 master-0 kubenswrapper[7518]: I0319 09:19:46.604062 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:19:47.012991 master-0 kubenswrapper[7518]: I0319 09:19:47.012887 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:47.013238 master-0 kubenswrapper[7518]: I0319 09:19:47.013002 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:47.013238 master-0 kubenswrapper[7518]: I0319 09:19:47.013122 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:47.013238 master-0 kubenswrapper[7518]: E0319 09:19:47.013159 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:19:47.013238 master-0 kubenswrapper[7518]: E0319 09:19:47.013177 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:47.013391 master-0 kubenswrapper[7518]: I0319 09:19:47.013226 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:47.013391 master-0 kubenswrapper[7518]: E0319 09:19:47.013237 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013218339 +0000 UTC m=+8.895801598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found
Mar 19 09:19:47.013391 master-0 kubenswrapper[7518]: E0319 09:19:47.013284 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:47.013391 master-0 kubenswrapper[7518]: E0319 09:19:47.013319 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:19:47.013391 master-0 kubenswrapper[7518]: E0319 09:19:47.013338 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013296471 +0000 UTC m=+8.895879770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:47.013391 master-0 kubenswrapper[7518]: E0319 09:19:47.013387 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013370383 +0000 UTC m=+8.895953672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: I0319 09:19:47.013413 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: I0319 09:19:47.013450 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: E0319 09:19:47.013491 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: I0319 09:19:47.013498 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: E0319 09:19:47.013564 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013505436 +0000 UTC m=+8.896088695 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: E0319 09:19:47.013611 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: E0319 09:19:47.013627 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013605089 +0000 UTC m=+8.896188338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: E0319 09:19:47.013685 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:19:47.013698 master-0 kubenswrapper[7518]: I0319 09:19:47.013694 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: E0319 09:19:47.013714 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013696281 +0000 UTC m=+8.896279580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: E0319 09:19:47.013741 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013727462 +0000 UTC m=+8.896310761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: E0319 09:19:47.013834 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: I0319 09:19:47.013835 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: E0319 09:19:47.013877 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.013870185 +0000 UTC m=+8.896453444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: I0319 09:19:47.013893 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: I0319 09:19:47.013934 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: I0319 09:19:47.013959 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:47.014016 master-0 kubenswrapper[7518]: E0319 09:19:47.013962 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014036 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.014004219 +0000 UTC m=+8.896587518 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014056 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: I0319 09:19:47.014118 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014164 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014189 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.014168683 +0000 UTC m=+8.896752022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014190 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014205 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.014197984 +0000 UTC m=+8.896781363 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014221 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.014213254 +0000 UTC m=+8.896796513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014244 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:47.014312 master-0 kubenswrapper[7518]: E0319 09:19:47.014285 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:51.014267635 +0000 UTC m=+8.896850924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:19:47.135565 master-0 kubenswrapper[7518]: I0319 09:19:47.135495 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:47.156833 master-0 kubenswrapper[7518]: I0319 09:19:47.156783 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:47.435049 master-0 kubenswrapper[7518]: I0319 09:19:47.434979 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:48.299961 master-0 kubenswrapper[7518]: I0319 09:19:48.299886 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:48.323852 master-0 kubenswrapper[7518]: I0319 09:19:48.323782 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:48.393391 master-0 kubenswrapper[7518]: I0319 09:19:48.393323 7518 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:19:48.393391 master-0 kubenswrapper[7518]: I0319 09:19:48.393362 7518 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:19:48.511708 master-0 kubenswrapper[7518]: I0319 09:19:48.511334 7518 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:19:48.512155 master-0 kubenswrapper[7518]: E0319 09:19:48.511895 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252,Command:[cluster-kube-storage-version-migrator-operator start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mvqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw_openshift-kube-storage-version-migrator-operator(a75049de-dcf1-4102-b339-f45d5015adea): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 19 09:19:48.512155 master-0 kubenswrapper[7518]: E0319 09:19:48.511981 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-apiserver-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,Command:[cluster-openshift-apiserver-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:KUBE_APISERVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qn48v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-apiserver-operator-d65958b8-96qpx_openshift-apiserver-operator(86c4b0e4-3481-465d-b00f-022d2c58c183): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 19 09:19:48.513998 master-0 kubenswrapper[7518]: E0319 09:19:48.513970 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" podUID="86c4b0e4-3481-465d-b00f-022d2c58c183"
Mar 19 09:19:48.514081 master-0 kubenswrapper[7518]: E0319 09:19:48.513974 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" podUID="a75049de-dcf1-4102-b339-f45d5015adea"
Mar 19 09:19:49.018291 master-0 kubenswrapper[7518]: I0319 09:19:49.018222 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:49.057241 master-0 kubenswrapper[7518]: I0319 09:19:49.057187 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:49.361551 master-0 kubenswrapper[7518]: I0319 09:19:49.353505 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:49.361551 master-0 kubenswrapper[7518]: W0319 09:19:49.356281 7518 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 19 09:19:49.361551 master-0 kubenswrapper[7518]: E0319 09:19:49.356344 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:19:49.378796 master-0 kubenswrapper[7518]: I0319 09:19:49.378746 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:19:49.387806 master-0 kubenswrapper[7518]: I0319 09:19:49.387772 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:19:49.410083 master-0 kubenswrapper[7518]: I0319 09:19:49.410053 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:19:49.411273 master-0 kubenswrapper[7518]: I0319 09:19:49.411255 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 09:19:49.645923 master-0 kubenswrapper[7518]: I0319 09:19:49.645773 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:49.646367 master-0 kubenswrapper[7518]: I0319 09:19:49.646066 7518 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:19:49.646367 master-0 kubenswrapper[7518]: I0319 09:19:49.646086 7518 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:19:49.665558 master-0 kubenswrapper[7518]: I0319 09:19:49.665522 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:19:50.404457 master-0 kubenswrapper[7518]: I0319 09:19:50.404398 7518 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:19:51.016501 master-0 kubenswrapper[7518]: I0319 09:19:51.016413 7518 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 09:19:51.021710 master-0 kubenswrapper[7518]: I0319 09:19:51.021666 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:51.042175 master-0 kubenswrapper[7518]: I0319 09:19:51.042122 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:19:51.087261 master-0 kubenswrapper[7518]: I0319 09:19:51.087180 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:19:51.087261 master-0 kubenswrapper[7518]: I0319 09:19:51.087237 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:19:51.087553 master-0 kubenswrapper[7518]: E0319 09:19:51.087433 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:51.087595 master-0 kubenswrapper[7518]: E0319 09:19:51.087563 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.087530845 +0000 UTC m=+16.970114144 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:19:51.087697 master-0 kubenswrapper[7518]: I0319 09:19:51.087629 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:19:51.087747 master-0 kubenswrapper[7518]: I0319 09:19:51.087696 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:19:51.087747 master-0 kubenswrapper[7518]: I0319 09:19:51.087741 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:19:51.087822 master-0 kubenswrapper[7518]: I0319 09:19:51.087778 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: 
\"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:51.087861 master-0 kubenswrapper[7518]: I0319 09:19:51.087828 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:51.087910 master-0 kubenswrapper[7518]: I0319 09:19:51.087865 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:51.087951 master-0 kubenswrapper[7518]: I0319 09:19:51.087921 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:51.087990 master-0 kubenswrapper[7518]: I0319 09:19:51.087959 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:19:51.088131 master-0 kubenswrapper[7518]: I0319 09:19:51.088057 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:51.088191 master-0 kubenswrapper[7518]: E0319 09:19:51.088178 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:51.088236 master-0 kubenswrapper[7518]: E0319 09:19:51.088193 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:51.088236 master-0 kubenswrapper[7518]: E0319 09:19:51.088222 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088208382 +0000 UTC m=+16.970791681 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:51.088322 master-0 kubenswrapper[7518]: E0319 09:19:51.088250 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088233833 +0000 UTC m=+16.970817092 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:19:51.088322 master-0 kubenswrapper[7518]: E0319 09:19:51.088293 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:51.088322 master-0 kubenswrapper[7518]: E0319 09:19:51.088317 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088310354 +0000 UTC m=+16.970893613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:51.088442 master-0 kubenswrapper[7518]: E0319 09:19:51.088386 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:19:51.088442 master-0 kubenswrapper[7518]: E0319 09:19:51.088436 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088419647 +0000 UTC m=+16.971002976 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found Mar 19 09:19:51.088575 master-0 kubenswrapper[7518]: E0319 09:19:51.088540 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:51.088618 master-0 kubenswrapper[7518]: E0319 09:19:51.088587 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088574381 +0000 UTC m=+16.971157710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:51.088674 master-0 kubenswrapper[7518]: E0319 09:19:51.088648 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:51.088714 master-0 kubenswrapper[7518]: E0319 09:19:51.088685 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088672773 +0000 UTC m=+16.971256122 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found Mar 19 09:19:51.088762 master-0 kubenswrapper[7518]: E0319 09:19:51.088749 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:51.088801 master-0 kubenswrapper[7518]: E0319 09:19:51.088786 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.088774206 +0000 UTC m=+16.971357525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:51.088844 master-0 kubenswrapper[7518]: I0319 09:19:51.088818 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:51.088883 master-0 kubenswrapper[7518]: I0319 09:19:51.088863 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " 
pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:51.088994 master-0 kubenswrapper[7518]: E0319 09:19:51.088964 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:51.089044 master-0 kubenswrapper[7518]: E0319 09:19:51.089013 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.089001193 +0000 UTC m=+16.971584502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:51.089089 master-0 kubenswrapper[7518]: E0319 09:19:51.089073 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:51.089127 master-0 kubenswrapper[7518]: E0319 09:19:51.089110 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.089098455 +0000 UTC m=+16.971681814 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:51.089188 master-0 kubenswrapper[7518]: E0319 09:19:51.089169 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:51.089234 master-0 kubenswrapper[7518]: E0319 09:19:51.089209 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.089198008 +0000 UTC m=+16.971781277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:51.089325 master-0 kubenswrapper[7518]: E0319 09:19:51.089289 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:51.089365 master-0 kubenswrapper[7518]: E0319 09:19:51.089326 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.089314231 +0000 UTC m=+16.971897560 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:51.089407 master-0 kubenswrapper[7518]: E0319 09:19:51.089386 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:51.089455 master-0 kubenswrapper[7518]: E0319 09:19:51.089421 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:19:59.089408953 +0000 UTC m=+16.971992312 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:51.737344 master-0 kubenswrapper[7518]: E0319 09:19:51.737229 7518 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483" Mar 19 09:19:51.737558 master-0 kubenswrapper[7518]: E0319 09:19:51.737533 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:openshift-api,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483,Command:[write-available-featuresets --asset-output-dir=/available-featuregates 
--payload-version=$(OPERATOR_IMAGE_VERSION)],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qh4t8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-95bf4f4d-bqqqq_openshift-config-operator(7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:19:51.738817 master-0 kubenswrapper[7518]: E0319 09:19:51.738753 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-api\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" podUID="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" Mar 19 09:19:52.531335 
master-0 kubenswrapper[7518]: I0319 09:19:52.530911 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:19:52.535574 master-0 kubenswrapper[7518]: I0319 09:19:52.535540 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:19:52.720400 master-0 kubenswrapper[7518]: E0319 09:19:52.720320 7518 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458" Mar 19 09:19:52.720615 master-0 kubenswrapper[7518]: E0319 09:19:52.720556 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458,Command:[cluster-kube-controller-manager-operator 
operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458,ValueFrom:nil,},EnvVar{Name:CLUSTER_POLICY_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:67c988e079558dc6b20232ebf9a7f7276fee60c756caed584c9715e0bec77a5a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-controller-manager-operator-ff989d6cc-rcnnp_openshift-kube-controller-manager-operator(a823c8bc-09ef-46a9-a1f3-155a34b89788): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:19:52.721824 master-0 kubenswrapper[7518]: E0319 09:19:52.721765 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" podUID="a823c8bc-09ef-46a9-a1f3-155a34b89788" Mar 19 09:19:53.159133 master-0 kubenswrapper[7518]: E0319 09:19:53.159060 7518 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263" Mar 19 09:19:53.159378 master-0 kubenswrapper[7518]: E0319 09:19:53.159279 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7k8wj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-b865698dc-wwkqz_openshift-service-ca-operator(5b36f3b2-caf9-40ad-a3a1-e83796142f54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:19:53.160559 master-0 kubenswrapper[7518]: E0319 09:19:53.160448 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" podUID="5b36f3b2-caf9-40ad-a3a1-e83796142f54" Mar 19 09:19:53.419866 master-0 kubenswrapper[7518]: I0319 09:19:53.419747 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:19:54.036679 master-0 kubenswrapper[7518]: E0319 09:19:54.036395 7518 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302" Mar 19 09:19:54.037453 master-0 kubenswrapper[7518]: E0319 09:19:54.036791 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-scheduler-operator-container,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302,Command:[cluster-kube-scheduler-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-kube-scheduler-operator-dddff6458-6fzwb_openshift-kube-scheduler-operator(62d3ca81-26e1-4625-a3aa-b1eabd31cfd6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:19:54.038747 master-0 kubenswrapper[7518]: E0319 09:19:54.038715 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" podUID="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" Mar 19 09:19:57.389522 master-0 kubenswrapper[7518]: E0319 09:19:57.389443 7518 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a" Mar 19 09:19:57.390043 master-0 kubenswrapper[7518]: E0319 09:19:57.389668 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n6zkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-8544cbcf9c-ct498_openshift-etcd-operator(9663cc40-a69d-42ba-890e-071cb85062f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:19:57.390943 master-0 kubenswrapper[7518]: E0319 09:19:57.390882 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" podUID="9663cc40-a69d-42ba-890e-071cb85062f5" Mar 19 09:19:59.176105 master-0 kubenswrapper[7518]: I0319 09:19:59.176008 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:59.176105 master-0 kubenswrapper[7518]: I0319 09:19:59.176089 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176223 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176230 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176284 7518 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176266249 +0000 UTC m=+33.058849508 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176305 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176319 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.17630291 +0000 UTC m=+33.058886169 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176352 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176361 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176387 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176379612 +0000 UTC m=+33.058962981 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176404 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176426 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176575 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176613 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" 
Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176658 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176694 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: I0319 09:19:59.176725 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176608 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176811 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176870 7518 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176846284 +0000 UTC m=+33.059429603 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176872 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176903 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176909 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176901375 +0000 UTC m=+33.059484654 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176744 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:19:59.176892 master-0 kubenswrapper[7518]: E0319 09:19:59.176928 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176919536 +0000 UTC m=+33.059502795 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.176944 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176936236 +0000 UTC m=+33.059519515 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.176963 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.176983 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.176976037 +0000 UTC m=+33.059559296 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.176683 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177003 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177005 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:15.177000278 +0000 UTC m=+33.059583537 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177030 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177039 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.177031138 +0000 UTC m=+33.059614407 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177056 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.177048189 +0000 UTC m=+33.059631468 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: I0319 09:19:59.176759 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177071 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.177061539 +0000 UTC m=+33.059644798 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: I0319 09:19:59.177106 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177221 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 09:19:59.177789 master-0 kubenswrapper[7518]: E0319 09:19:59.177256 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:15.177247824 +0000 UTC m=+33.059831083 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found Mar 19 09:19:59.992089 master-0 kubenswrapper[7518]: E0319 09:19:59.991991 7518 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3" Mar 19 09:19:59.992320 master-0 kubenswrapper[7518]: E0319 09:19:59.992209 7518 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:csi-snapshot-controller-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3,Command:[],Args:[start -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERAND_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24,ValueFrom:nil,},EnvVar{Name:WEBHOOK_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e46378af340ca82a8551fdfa20d0acf4ff4a5d43ceb0d4748eebc55be437d04,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b49lj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000160000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-snapshot-controller-operator-5f5d689c6b-dspnb_openshift-cluster-storage-operator(e09725c2-45c6-4a60-b817-6e5316d6f8e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:19:59.993462 master-0 kubenswrapper[7518]: E0319 09:19:59.993400 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" podUID="e09725c2-45c6-4a60-b817-6e5316d6f8e8" Mar 19 09:20:00.701381 master-0 kubenswrapper[7518]: I0319 09:20:00.701309 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-95w9b"] Mar 19 09:20:01.165341 master-0 kubenswrapper[7518]: W0319 09:20:01.165272 7518 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod307605e6_d1cf_4172_8e7d_918c435f3577.slice/crio-f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37 WatchSource:0}: Error finding container f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37: Status 404 returned error can't find the container with id f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37 Mar 19 09:20:01.437752 master-0 kubenswrapper[7518]: I0319 09:20:01.437708 7518 generic.go:334] "Generic (PLEG): container finished" podID="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" containerID="3efe12fe8fe63c4780aeba64aa817d31d700162f5f08cf1695416899a639c633" exitCode=0 Mar 19 09:20:01.437969 master-0 kubenswrapper[7518]: I0319 09:20:01.437784 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerDied","Data":"3efe12fe8fe63c4780aeba64aa817d31d700162f5f08cf1695416899a639c633"} Mar 19 09:20:01.441332 master-0 kubenswrapper[7518]: I0319 09:20:01.441286 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" event={"ID":"357980ba-1957-412f-afb5-04281eca2bee","Type":"ContainerStarted","Data":"fdd9285acae300c3c00a66ae69c66c3dae68ae6703f408d0bdc875283085bf0e"} Mar 19 09:20:01.442693 master-0 kubenswrapper[7518]: I0319 09:20:01.442651 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-95w9b" event={"ID":"307605e6-d1cf-4172-8e7d-918c435f3577","Type":"ContainerStarted","Data":"f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37"} Mar 19 09:20:09.468555 master-0 kubenswrapper[7518]: I0319 09:20:09.467992 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-95w9b" 
event={"ID":"307605e6-d1cf-4172-8e7d-918c435f3577","Type":"ContainerStarted","Data":"0a96c4cc071492bb400801f0c8920d10434b09b3ffa489371b6c92e02f3443e4"} Mar 19 09:20:09.469295 master-0 kubenswrapper[7518]: I0319 09:20:09.468560 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:20:09.469704 master-0 kubenswrapper[7518]: I0319 09:20:09.469667 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2s58d" event={"ID":"bec90db1-02e3-4211-8c33-f8bcc304e3a7","Type":"ContainerStarted","Data":"eb34768aecd13df8a57436397abeeed767408121617b67e5e27dc1b91cbaff7e"} Mar 19 09:20:09.471151 master-0 kubenswrapper[7518]: I0319 09:20:09.471108 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" event={"ID":"083882c0-ea2f-4405-8cf1-cce5b91fe602","Type":"ContainerStarted","Data":"787b47766f4f361558a231cbdd8f60cfc309ddb2f5ce9e60ddd25ab14ca4bf8c"} Mar 19 09:20:09.472379 master-0 kubenswrapper[7518]: I0319 09:20:09.472354 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" event={"ID":"86c4b0e4-3481-465d-b00f-022d2c58c183","Type":"ContainerStarted","Data":"f771ab2ec3cdd043d42f5957ed84808b36b0f576aa969f9e8666ac7eb9b0b134"} Mar 19 09:20:09.473442 master-0 kubenswrapper[7518]: I0319 09:20:09.473421 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" event={"ID":"a75049de-dcf1-4102-b339-f45d5015adea","Type":"ContainerStarted","Data":"239a4aff890f70e77543607e882c4861b3b7d9ef6cf1f395add14a0ad7fc62e0"} Mar 19 09:20:15.253646 master-0 kubenswrapper[7518]: I0319 09:20:15.253283 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:20:15.253646 master-0 kubenswrapper[7518]: I0319 09:20:15.253645 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253453 7518 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253685 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253733 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253759 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: 
secret "olm-operator-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253784 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics podName:33e92e5d-61ea-45b2-b357-ebffdaebf4af nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.253750405 +0000 UTC m=+65.136333684 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-6qck2" (UID: "33e92e5d-61ea-45b2-b357-ebffdaebf4af") : secret "marketplace-operator-metrics" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253805 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253832 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.253808576 +0000 UTC m=+65.136391925 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253853 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.253845417 +0000 UTC m=+65.136428796 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253826 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253893 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253924 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253950 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253954 7518 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.253977 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.253996 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert podName:32b1ae47-ef83-448d-b40d-a836cb6c6fc0 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.253981161 +0000 UTC m=+65.136564530 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert") pod "cluster-version-operator-56d8475767-sbhx2" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0") : secret "cluster-version-operator-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254169 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.254187 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254204 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254195647 +0000 UTC m=+65.136778916 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.254226 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254041 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254284 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254276799 +0000 UTC m=+65.136860068 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254089 7518 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254381 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls podName:a417fe25-4aca-471c-941d-c195b6141042 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254352261 +0000 UTC m=+65.136935600 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-dcmsc" (UID: "a417fe25-4aca-471c-941d-c195b6141042") : secret "image-registry-operator-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254398 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.254415 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254460 7518 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254128 7518 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254531 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254501524 +0000 UTC m=+65.137084783 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "node-tuning-operator-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254140 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254553 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls podName:6a8e2194-aba6-4929-a29c-47c63c8ff799 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254544045 +0000 UTC m=+65.137127304 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls") pod "ingress-operator-66b84d69b-pgdrx" (UID: "6a8e2194-aba6-4929-a29c-47c63c8ff799") : secret "metrics-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254577 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254565676 +0000 UTC m=+65.137149035 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254244 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254598 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert podName:9ac42112-6a00-4c17-b230-75b565aa668f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254591217 +0000 UTC m=+65.137174476 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-wh9q6" (UID: "9ac42112-6a00-4c17-b230-75b565aa668f") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254620 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254610267 +0000 UTC m=+65.137193686 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: I0319 09:20:15.254581 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254664 7518 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 09:20:15.254975 master-0 kubenswrapper[7518]: E0319 09:20:15.254710 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls podName:ece5177b-ae15-4c33-a8d4-612ab50b2b8b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.254694839 +0000 UTC m=+65.137278188 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls") pod "dns-operator-9c5679d8f-fdxtp" (UID: "ece5177b-ae15-4c33-a8d4-612ab50b2b8b") : secret "metrics-tls" not found
Mar 19 09:20:16.180121 master-0 kubenswrapper[7518]: I0319 09:20:16.180033 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:20:16.180331 master-0 kubenswrapper[7518]: I0319 09:20:16.180274 7518 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:20:16.194603 master-0 kubenswrapper[7518]: I0319 09:20:16.194556 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:20:18.505408 master-0 kubenswrapper[7518]: I0319 09:20:18.504763 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" event={"ID":"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8","Type":"ContainerStarted","Data":"82689d1e71e4b8853162fd6caae5b840062273cb60c91d420d169ba6d7d40278"}
Mar 19 09:20:18.506408 master-0 kubenswrapper[7518]: I0319 09:20:18.506361 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" event={"ID":"5b36f3b2-caf9-40ad-a3a1-e83796142f54","Type":"ContainerStarted","Data":"a9e3c64428edfb89f548d2d0f11b93a4546a142c8d9ea26eed5c6670f21e1d16"}
Mar 19 09:20:18.507639 master-0 kubenswrapper[7518]: I0319 09:20:18.507603 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerStarted","Data":"0147b737bd4c355c56866c4e60a80701e47045be188939cc9ec3ede186a99781"}
Mar 19 09:20:19.423963 master-0 kubenswrapper[7518]: I0319 09:20:19.423720 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"]
Mar 19 09:20:19.424207 master-0 kubenswrapper[7518]: E0319 09:20:19.424080 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober"
Mar 19 09:20:19.424207 master-0 kubenswrapper[7518]: I0319 09:20:19.424096 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober"
Mar 19 09:20:19.424207 master-0 kubenswrapper[7518]: E0319 09:20:19.424106 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller"
Mar 19 09:20:19.424207 master-0 kubenswrapper[7518]: I0319 09:20:19.424113 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller"
Mar 19 09:20:19.424360 master-0 kubenswrapper[7518]: I0319 09:20:19.424221 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober"
Mar 19 09:20:19.424360 master-0 kubenswrapper[7518]: I0319 09:20:19.424231 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller"
Mar 19 09:20:19.425807 master-0 kubenswrapper[7518]: I0319 09:20:19.424586 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:20:19.427519 master-0 kubenswrapper[7518]: I0319 09:20:19.427448 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 09:20:19.427972 master-0 kubenswrapper[7518]: I0319 09:20:19.427934 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 09:20:19.457112 master-0 kubenswrapper[7518]: I0319 09:20:19.457041 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"]
Mar 19 09:20:19.509733 master-0 kubenswrapper[7518]: I0319 09:20:19.509696 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zz2n\" (UniqueName: \"kubernetes.io/projected/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc-kube-api-access-2zz2n\") pod \"migrator-8487694857-nkvjk\" (UID: \"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:20:19.512142 master-0 kubenswrapper[7518]: I0319 09:20:19.512110 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" event={"ID":"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6","Type":"ContainerStarted","Data":"be05318150c766720e5d230c0bf2401720113751ff91aa74d2d72ed4d56c5f47"}
Mar 19 09:20:19.513463 master-0 kubenswrapper[7518]: I0319 09:20:19.513432 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" event={"ID":"a823c8bc-09ef-46a9-a1f3-155a34b89788","Type":"ContainerStarted","Data":"fe703627bf17490741c98c350c37ad5f26868d707caaf28e298dbcd09ba6eb50"}
Mar 19 09:20:19.518520 master-0 kubenswrapper[7518]: I0319 09:20:19.518484 7518 generic.go:334] "Generic (PLEG): container finished" podID="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" containerID="0147b737bd4c355c56866c4e60a80701e47045be188939cc9ec3ede186a99781" exitCode=0
Mar 19 09:20:19.518592 master-0 kubenswrapper[7518]: I0319 09:20:19.518496 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerDied","Data":"0147b737bd4c355c56866c4e60a80701e47045be188939cc9ec3ede186a99781"}
Mar 19 09:20:19.520243 master-0 kubenswrapper[7518]: I0319 09:20:19.520206 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" event={"ID":"e09725c2-45c6-4a60-b817-6e5316d6f8e8","Type":"ContainerStarted","Data":"dbf3b7a4f4f0df660fb53b80cb02fbb3e8da389015b3269c1d219e3d7b1af269"}
Mar 19 09:20:19.523951 master-0 kubenswrapper[7518]: I0319 09:20:19.523900 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" event={"ID":"9663cc40-a69d-42ba-890e-071cb85062f5","Type":"ContainerStarted","Data":"cdf18d2610050197f807cf4a5fc0308ba6a5aa77b434d76558194e6bb3ba81d0"}
Mar 19 09:20:19.525268 master-0 kubenswrapper[7518]: I0319 09:20:19.525229 7518 generic.go:334] "Generic (PLEG): container finished" podID="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" containerID="82689d1e71e4b8853162fd6caae5b840062273cb60c91d420d169ba6d7d40278" exitCode=0
Mar 19 09:20:19.525350 master-0 kubenswrapper[7518]: I0319 09:20:19.525270 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" event={"ID":"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8","Type":"ContainerDied","Data":"82689d1e71e4b8853162fd6caae5b840062273cb60c91d420d169ba6d7d40278"}
Mar 19 09:20:19.611408 master-0 kubenswrapper[7518]: I0319 09:20:19.610643 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zz2n\" (UniqueName: \"kubernetes.io/projected/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc-kube-api-access-2zz2n\") pod \"migrator-8487694857-nkvjk\" (UID: \"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:20:19.782401 master-0 kubenswrapper[7518]: I0319 09:20:19.782347 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zz2n\" (UniqueName: \"kubernetes.io/projected/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc-kube-api-access-2zz2n\") pod \"migrator-8487694857-nkvjk\" (UID: \"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:20:20.040237 master-0 kubenswrapper[7518]: I0319 09:20:20.040034 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:20:20.992459 master-0 kubenswrapper[7518]: I0319 09:20:20.991194 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"]
Mar 19 09:20:21.033163 master-0 kubenswrapper[7518]: W0319 09:20:21.033123 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d3a3480_9f1f_4dd1_b58d_9721e4f18fbc.slice/crio-daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1 WatchSource:0}: Error finding container daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1: Status 404 returned error can't find the container with id daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1
Mar 19 09:20:21.539037 master-0 kubenswrapper[7518]: I0319 09:20:21.538977 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk" event={"ID":"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc","Type":"ContainerStarted","Data":"daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1"}
Mar 19 09:20:21.698509 master-0 kubenswrapper[7518]: I0319 09:20:21.697329 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"]
Mar 19 09:20:21.700388 master-0 kubenswrapper[7518]: I0319 09:20:21.699499 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.703216 master-0 kubenswrapper[7518]: I0319 09:20:21.703158 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:20:21.703399 master-0 kubenswrapper[7518]: I0319 09:20:21.703366 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:20:21.703535 master-0 kubenswrapper[7518]: I0319 09:20:21.703514 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:20:21.704867 master-0 kubenswrapper[7518]: I0319 09:20:21.704825 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:20:21.705080 master-0 kubenswrapper[7518]: I0319 09:20:21.705057 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:20:21.738124 master-0 kubenswrapper[7518]: I0319 09:20:21.738065 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.738399 master-0 kubenswrapper[7518]: I0319 09:20:21.738142 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.738399 master-0 kubenswrapper[7518]: I0319 09:20:21.738169 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87w2n\" (UniqueName: \"kubernetes.io/projected/dcbab571-c45a-490f-bd7b-8d3c519e6d03-kube-api-access-87w2n\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.738399 master-0 kubenswrapper[7518]: I0319 09:20:21.738192 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.839774 master-0 kubenswrapper[7518]: I0319 09:20:21.839650 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.839963 master-0 kubenswrapper[7518]: E0319 09:20:21.839821 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:21.839963 master-0 kubenswrapper[7518]: E0319 09:20:21.839912 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:22.339888411 +0000 UTC m=+40.222471670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "client-ca" not found
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: I0319 09:20:21.840130 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: I0319 09:20:21.840202 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87w2n\" (UniqueName: \"kubernetes.io/projected/dcbab571-c45a-490f-bd7b-8d3c519e6d03-kube-api-access-87w2n\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: I0319 09:20:21.840238 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: E0319 09:20:21.840259 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: E0319 09:20:21.840323 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:22.340300236 +0000 UTC m=+40.222883495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : secret "serving-cert" not found
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: E0319 09:20:21.840363 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found
Mar 19 09:20:21.840437 master-0 kubenswrapper[7518]: E0319 09:20:21.840413 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:22.340399029 +0000 UTC m=+40.222982288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "config" not found
Mar 19 09:20:22.346843 master-0 kubenswrapper[7518]: I0319 09:20:22.346448 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:22.346843 master-0 kubenswrapper[7518]: I0319 09:20:22.346850 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: E0319 09:20:22.346646 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: E0319 09:20:22.346948 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:23.346934098 +0000 UTC m=+41.229517357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : secret "serving-cert" not found
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: E0319 09:20:22.346971 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: I0319 09:20:22.346986 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: E0319 09:20:22.346998 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:23.34699008 +0000 UTC m=+41.229573339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "config" not found
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: E0319 09:20:22.347023 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:22.347589 master-0 kubenswrapper[7518]: E0319 09:20:22.347052 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:23.347045173 +0000 UTC m=+41.229628432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "client-ca" not found
Mar 19 09:20:23.362053 master-0 kubenswrapper[7518]: I0319 09:20:23.361961 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: E0319 09:20:23.362109 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: E0319 09:20:23.362200 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:25.362173042 +0000 UTC m=+43.244756331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "client-ca" not found
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: I0319 09:20:23.362248 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: I0319 09:20:23.362329 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: E0319 09:20:23.362440 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: E0319 09:20:23.362525 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:25.362511434 +0000 UTC m=+43.245094733 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "config" not found
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: E0319 09:20:23.362623 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:23.363147 master-0 kubenswrapper[7518]: E0319 09:20:23.362692 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:25.362675679 +0000 UTC m=+43.245258978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : secret "serving-cert" not found
Mar 19 09:20:23.886971 master-0 kubenswrapper[7518]: I0319 09:20:23.886896 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"]
Mar 19 09:20:25.386935 master-0 kubenswrapper[7518]: I0319 09:20:25.386866 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:25.386935 master-0 kubenswrapper[7518]: I0319 09:20:25.386941 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: E0319 09:20:25.387066 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: E0319 09:20:25.387157 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:29.387136021 +0000 UTC m=+47.269719280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "client-ca" not found Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: I0319 09:20:25.387188 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: E0319 09:20:25.387228 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: E0319 09:20:25.387262 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: E0319 
09:20:25.387280 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:29.387269026 +0000 UTC m=+47.269852285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : secret "serving-cert" not found Mar 19 09:20:25.387802 master-0 kubenswrapper[7518]: E0319 09:20:25.387315 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:29.387300607 +0000 UTC m=+47.269883866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "config" not found Mar 19 09:20:29.432358 master-0 kubenswrapper[7518]: I0319 09:20:29.432290 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: E0319 09:20:29.432447 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: E0319 09:20:29.432570 7518 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.432551607 +0000 UTC m=+55.315134946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "config" not found Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: I0319 09:20:29.432720 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: I0319 09:20:29.432777 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: E0319 09:20:29.432826 7518 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: E0319 09:20:29.432871 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.432859807 +0000 UTC m=+55.315443066 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : configmap "client-ca" not found Mar 19 09:20:29.433018 master-0 kubenswrapper[7518]: E0319 09:20:29.432999 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:29.433257 master-0 kubenswrapper[7518]: E0319 09:20:29.433062 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert podName:dcbab571-c45a-490f-bd7b-8d3c519e6d03 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.433047273 +0000 UTC m=+55.315630532 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert") pod "route-controller-manager-6d9bb777f5-x9r5p" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03") : secret "serving-cert" not found Mar 19 09:20:32.780500 master-0 kubenswrapper[7518]: I0319 09:20:32.779563 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87w2n\" (UniqueName: \"kubernetes.io/projected/dcbab571-c45a-490f-bd7b-8d3c519e6d03-kube-api-access-87w2n\") pod \"route-controller-manager-6d9bb777f5-x9r5p\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") " pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" Mar 19 09:20:34.524497 master-0 kubenswrapper[7518]: I0319 09:20:34.524273 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb"] Mar 19 09:20:34.525121 master-0 kubenswrapper[7518]: I0319 09:20:34.524936 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" Mar 19 09:20:34.611947 master-0 kubenswrapper[7518]: I0319 09:20:34.610445 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgsm7\" (UniqueName: \"kubernetes.io/projected/e3376275-294d-446d-9b4c-930df60dba01-kube-api-access-cgsm7\") pod \"csi-snapshot-controller-64854d9cff-dzfgb\" (UID: \"e3376275-294d-446d-9b4c-930df60dba01\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" Mar 19 09:20:34.711185 master-0 kubenswrapper[7518]: I0319 09:20:34.711110 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgsm7\" (UniqueName: \"kubernetes.io/projected/e3376275-294d-446d-9b4c-930df60dba01-kube-api-access-cgsm7\") pod \"csi-snapshot-controller-64854d9cff-dzfgb\" (UID: \"e3376275-294d-446d-9b4c-930df60dba01\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" Mar 19 09:20:34.756549 master-0 kubenswrapper[7518]: I0319 09:20:34.754215 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb"] Mar 19 09:20:34.806637 master-0 kubenswrapper[7518]: I0319 09:20:34.803161 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgsm7\" (UniqueName: \"kubernetes.io/projected/e3376275-294d-446d-9b4c-930df60dba01-kube-api-access-cgsm7\") pod \"csi-snapshot-controller-64854d9cff-dzfgb\" (UID: \"e3376275-294d-446d-9b4c-930df60dba01\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" Mar 19 09:20:34.846657 master-0 kubenswrapper[7518]: I0319 09:20:34.845941 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" Mar 19 09:20:35.072210 master-0 kubenswrapper[7518]: I0319 09:20:35.069185 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"] Mar 19 09:20:35.072210 master-0 kubenswrapper[7518]: I0319 09:20:35.069861 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.080045 master-0 kubenswrapper[7518]: I0319 09:20:35.079961 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 09:20:35.080868 master-0 kubenswrapper[7518]: I0319 09:20:35.080406 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:20:35.080868 master-0 kubenswrapper[7518]: I0319 09:20:35.080559 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:20:35.080868 master-0 kubenswrapper[7518]: I0319 09:20:35.080594 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 09:20:35.080868 master-0 kubenswrapper[7518]: I0319 09:20:35.080680 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 09:20:35.080868 master-0 kubenswrapper[7518]: I0319 09:20:35.080590 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:20:35.080868 master-0 kubenswrapper[7518]: I0319 09:20:35.080760 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 09:20:35.081066 master-0 kubenswrapper[7518]: I0319 09:20:35.080958 7518 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:20:35.102488 master-0 kubenswrapper[7518]: I0319 09:20:35.102131 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"] Mar 19 09:20:35.128434 master-0 kubenswrapper[7518]: I0319 09:20:35.128379 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128725 master-0 kubenswrapper[7518]: I0319 09:20:35.128531 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128725 master-0 kubenswrapper[7518]: I0319 09:20:35.128615 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-dir\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128725 master-0 kubenswrapper[7518]: I0319 09:20:35.128665 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-encryption-config\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128725 master-0 
kubenswrapper[7518]: I0319 09:20:35.128716 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7jg\" (UniqueName: \"kubernetes.io/projected/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-kube-api-access-4w7jg\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128725 master-0 kubenswrapper[7518]: I0319 09:20:35.128731 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-policies\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128889 master-0 kubenswrapper[7518]: I0319 09:20:35.128748 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.128889 master-0 kubenswrapper[7518]: I0319 09:20:35.128764 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-trusted-ca-bundle\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236146 master-0 kubenswrapper[7518]: I0319 09:20:35.236097 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7jg\" (UniqueName: \"kubernetes.io/projected/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-kube-api-access-4w7jg\") pod 
\"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236146 master-0 kubenswrapper[7518]: I0319 09:20:35.236152 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-policies\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236146 master-0 kubenswrapper[7518]: I0319 09:20:35.236170 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-trusted-ca-bundle\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236500 master-0 kubenswrapper[7518]: I0319 09:20:35.236188 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236500 master-0 kubenswrapper[7518]: I0319 09:20:35.236207 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236500 master-0 kubenswrapper[7518]: I0319 09:20:35.236223 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236500 master-0 kubenswrapper[7518]: I0319 09:20:35.236347 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-dir\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.236500 master-0 kubenswrapper[7518]: I0319 09:20:35.236419 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-encryption-config\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: E0319 09:20:35.237001 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: E0319 09:20:35.237090 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.737067617 +0000 UTC m=+53.619650876 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "serving-cert" not found Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: E0319 09:20:35.237312 7518 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: E0319 09:20:35.237428 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.737394388 +0000 UTC m=+53.619977747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : configmap "etcd-serving-ca" not found Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: E0319 09:20:35.237543 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/etcd-client: secret "etcd-client" not found Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: E0319 09:20:35.237577 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.737568074 +0000 UTC m=+53.620151453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "etcd-client" not found Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: I0319 09:20:35.237619 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-dir\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.238159 master-0 kubenswrapper[7518]: I0319 09:20:35.238101 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"] Mar 19 09:20:35.238543 master-0 kubenswrapper[7518]: E0319 09:20:35.238513 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p" podUID="dcbab571-c45a-490f-bd7b-8d3c519e6d03" Mar 19 09:20:35.238592 master-0 kubenswrapper[7518]: I0319 09:20:35.238525 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-trusted-ca-bundle\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.238913 master-0 kubenswrapper[7518]: I0319 09:20:35.238863 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-policies\") pod \"apiserver-5547669f67-dhd9c\" (UID: 
\"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.241547 master-0 kubenswrapper[7518]: I0319 09:20:35.240521 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-encryption-config\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:35.242531 master-0 kubenswrapper[7518]: I0319 09:20:35.242068 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-866d56f9b-6dc8n"] Mar 19 09:20:35.242716 master-0 kubenswrapper[7518]: I0319 09:20:35.242615 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n" Mar 19 09:20:35.253214 master-0 kubenswrapper[7518]: I0319 09:20:35.253133 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:20:35.253400 master-0 kubenswrapper[7518]: I0319 09:20:35.253254 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:20:35.253708 master-0 kubenswrapper[7518]: I0319 09:20:35.253519 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:20:35.253708 master-0 kubenswrapper[7518]: I0319 09:20:35.253564 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 09:20:35.253708 master-0 kubenswrapper[7518]: I0319 09:20:35.253603 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:20:35.259160 master-0 kubenswrapper[7518]: I0319 09:20:35.259108 7518 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866d56f9b-6dc8n"]
Mar 19 09:20:35.259922 master-0 kubenswrapper[7518]: I0319 09:20:35.259886 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:20:35.277190 master-0 kubenswrapper[7518]: I0319 09:20:35.275773 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7jg\" (UniqueName: \"kubernetes.io/projected/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-kube-api-access-4w7jg\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:35.338704 master-0 kubenswrapper[7518]: I0319 09:20:35.337933 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.338704 master-0 kubenswrapper[7518]: I0319 09:20:35.338017 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbfh6\" (UniqueName: \"kubernetes.io/projected/cfdbfc13-79f9-4369-990f-29b31f7ec8da-kube-api-access-fbfh6\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.338704 master-0 kubenswrapper[7518]: I0319 09:20:35.338061 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-proxy-ca-bundles\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.338704 master-0 kubenswrapper[7518]: I0319 09:20:35.338240 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-config\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.338704 master-0 kubenswrapper[7518]: I0319 09:20:35.338679 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-client-ca\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.427257 master-0 kubenswrapper[7518]: I0319 09:20:35.427201 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-xlhg9"]
Mar 19 09:20:35.427767 master-0 kubenswrapper[7518]: I0319 09:20:35.427749 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.430481 master-0 kubenswrapper[7518]: I0319 09:20:35.430389 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 09:20:35.430699 master-0 kubenswrapper[7518]: I0319 09:20:35.430677 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 09:20:35.430871 master-0 kubenswrapper[7518]: I0319 09:20:35.430850 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 09:20:35.432846 master-0 kubenswrapper[7518]: I0319 09:20:35.432826 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 09:20:35.439329 master-0 kubenswrapper[7518]: I0319 09:20:35.439290 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-config\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.439484 master-0 kubenswrapper[7518]: I0319 09:20:35.439449 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-client-ca\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.439712 master-0 kubenswrapper[7518]: I0319 09:20:35.439662 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.439809 master-0 kubenswrapper[7518]: I0319 09:20:35.439751 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbfh6\" (UniqueName: \"kubernetes.io/projected/cfdbfc13-79f9-4369-990f-29b31f7ec8da-kube-api-access-fbfh6\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.439863 master-0 kubenswrapper[7518]: I0319 09:20:35.439834 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-proxy-ca-bundles\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.439986 master-0 kubenswrapper[7518]: E0319 09:20:35.439953 7518 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:35.440083 master-0 kubenswrapper[7518]: E0319 09:20:35.440066 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert podName:cfdbfc13-79f9-4369-990f-29b31f7ec8da nodeName:}" failed. No retries permitted until 2026-03-19 09:20:35.940035405 +0000 UTC m=+53.822618654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert") pod "controller-manager-866d56f9b-6dc8n" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da") : secret "serving-cert" not found
Mar 19 09:20:35.440458 master-0 kubenswrapper[7518]: I0319 09:20:35.440424 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-config\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.441676 master-0 kubenswrapper[7518]: I0319 09:20:35.441631 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-proxy-ca-bundles\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.442425 master-0 kubenswrapper[7518]: I0319 09:20:35.442398 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-client-ca\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.447148 master-0 kubenswrapper[7518]: I0319 09:20:35.447114 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-xlhg9"]
Mar 19 09:20:35.468395 master-0 kubenswrapper[7518]: I0319 09:20:35.468293 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbfh6\" (UniqueName: \"kubernetes.io/projected/cfdbfc13-79f9-4369-990f-29b31f7ec8da-kube-api-access-fbfh6\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.540817 master-0 kubenswrapper[7518]: I0319 09:20:35.540644 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljb2\" (UniqueName: \"kubernetes.io/projected/d90f590a-6118-4769-b18f-fec67dd62c20-kube-api-access-nljb2\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.541595 master-0 kubenswrapper[7518]: I0319 09:20:35.541058 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d90f590a-6118-4769-b18f-fec67dd62c20-signing-key\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.541595 master-0 kubenswrapper[7518]: I0319 09:20:35.541402 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d90f590a-6118-4769-b18f-fec67dd62c20-signing-cabundle\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.605560 master-0 kubenswrapper[7518]: I0319 09:20:35.605445 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:35.612403 master-0 kubenswrapper[7518]: I0319 09:20:35.612354 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:35.642742 master-0 kubenswrapper[7518]: I0319 09:20:35.642677 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljb2\" (UniqueName: \"kubernetes.io/projected/d90f590a-6118-4769-b18f-fec67dd62c20-kube-api-access-nljb2\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.642961 master-0 kubenswrapper[7518]: I0319 09:20:35.642904 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d90f590a-6118-4769-b18f-fec67dd62c20-signing-key\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.643018 master-0 kubenswrapper[7518]: I0319 09:20:35.642982 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d90f590a-6118-4769-b18f-fec67dd62c20-signing-cabundle\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.643979 master-0 kubenswrapper[7518]: I0319 09:20:35.643942 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d90f590a-6118-4769-b18f-fec67dd62c20-signing-cabundle\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.646906 master-0 kubenswrapper[7518]: I0319 09:20:35.646869 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d90f590a-6118-4769-b18f-fec67dd62c20-signing-key\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.658609 master-0 kubenswrapper[7518]: I0319 09:20:35.658559 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljb2\" (UniqueName: \"kubernetes.io/projected/d90f590a-6118-4769-b18f-fec67dd62c20-kube-api-access-nljb2\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.744099 master-0 kubenswrapper[7518]: I0319 09:20:35.743609 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:20:35.744099 master-0 kubenswrapper[7518]: I0319 09:20:35.743783 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87w2n\" (UniqueName: \"kubernetes.io/projected/dcbab571-c45a-490f-bd7b-8d3c519e6d03-kube-api-access-87w2n\") pod \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\" (UID: \"dcbab571-c45a-490f-bd7b-8d3c519e6d03\") "
Mar 19 09:20:35.744324 master-0 kubenswrapper[7518]: I0319 09:20:35.744141 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:35.744324 master-0 kubenswrapper[7518]: E0319 09:20:35.744266 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:35.744400 master-0 kubenswrapper[7518]: E0319 09:20:35.744331 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 09:20:35.744400 master-0 kubenswrapper[7518]: E0319 09:20:35.744336 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.744313961 +0000 UTC m=+54.626897220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "serving-cert" not found
Mar 19 09:20:35.744400 master-0 kubenswrapper[7518]: E0319 09:20:35.744372 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.744362753 +0000 UTC m=+54.626946012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "etcd-client" not found
Mar 19 09:20:35.744400 master-0 kubenswrapper[7518]: I0319 09:20:35.744275 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:35.744400 master-0 kubenswrapper[7518]: I0319 09:20:35.744400 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:35.744686 master-0 kubenswrapper[7518]: E0319 09:20:35.744625 7518 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 09:20:35.744914 master-0 kubenswrapper[7518]: E0319 09:20:35.744700 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.744682333 +0000 UTC m=+54.627265592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : configmap "etcd-serving-ca" not found
Mar 19 09:20:35.747016 master-0 kubenswrapper[7518]: I0319 09:20:35.746941 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcbab571-c45a-490f-bd7b-8d3c519e6d03-kube-api-access-87w2n" (OuterVolumeSpecName: "kube-api-access-87w2n") pod "dcbab571-c45a-490f-bd7b-8d3c519e6d03" (UID: "dcbab571-c45a-490f-bd7b-8d3c519e6d03"). InnerVolumeSpecName "kube-api-access-87w2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:20:35.846459 master-0 kubenswrapper[7518]: I0319 09:20:35.846407 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87w2n\" (UniqueName: \"kubernetes.io/projected/dcbab571-c45a-490f-bd7b-8d3c519e6d03-kube-api-access-87w2n\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:35.950875 master-0 kubenswrapper[7518]: I0319 09:20:35.947871 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:35.950875 master-0 kubenswrapper[7518]: E0319 09:20:35.948092 7518 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:35.950875 master-0 kubenswrapper[7518]: E0319 09:20:35.948196 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert podName:cfdbfc13-79f9-4369-990f-29b31f7ec8da nodeName:}" failed. No retries permitted until 2026-03-19 09:20:36.948172439 +0000 UTC m=+54.830755878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert") pod "controller-manager-866d56f9b-6dc8n" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da") : secret "serving-cert" not found
Mar 19 09:20:36.066173 master-0 kubenswrapper[7518]: I0319 09:20:36.065999 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb"]
Mar 19 09:20:36.097653 master-0 kubenswrapper[7518]: I0319 09:20:36.097561 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-xlhg9"]
Mar 19 09:20:36.611062 master-0 kubenswrapper[7518]: I0319 09:20:36.611006 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9" event={"ID":"d90f590a-6118-4769-b18f-fec67dd62c20","Type":"ContainerStarted","Data":"b3be33b5d3d587329e4bed8824638df74ead369512b29f24a662b38a316b3521"}
Mar 19 09:20:36.611899 master-0 kubenswrapper[7518]: I0319 09:20:36.611880 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9" event={"ID":"d90f590a-6118-4769-b18f-fec67dd62c20","Type":"ContainerStarted","Data":"83f05b1eef52787aaeaed1465a46122a61b271c0e893c29d510caa22b344a675"}
Mar 19 09:20:36.613176 master-0 kubenswrapper[7518]: I0319 09:20:36.613149 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" event={"ID":"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8","Type":"ContainerStarted","Data":"a481a6ff530440a1264d2535843bd9da5aad52194298733f7093828af5a8bb83"}
Mar 19 09:20:36.613428 master-0 kubenswrapper[7518]: I0319 09:20:36.613400 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:20:36.615129 master-0 kubenswrapper[7518]: I0319 09:20:36.615101 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk" event={"ID":"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc","Type":"ContainerStarted","Data":"7d2c1f8d935c4dfeafcba1ad99f176625483ec4c214eaa8e34d2d0008c5f24ea"}
Mar 19 09:20:36.615196 master-0 kubenswrapper[7518]: I0319 09:20:36.615130 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk" event={"ID":"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc","Type":"ContainerStarted","Data":"f5a30490fb04961a46cd9ca4393e352524ff666303c2e604075865b43a7f9094"}
Mar 19 09:20:36.616124 master-0 kubenswrapper[7518]: I0319 09:20:36.616100 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerStarted","Data":"9bea8e39775551acb259adea0fc4cfd103c16875f290afb2712a31409a51f01c"}
Mar 19 09:20:36.618120 master-0 kubenswrapper[7518]: I0319 09:20:36.618100 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerStarted","Data":"b4cd172092883e2c59c413605caa9eda30c5b4011ddd9168033acc5dfa87297f"}
Mar 19 09:20:36.618230 master-0 kubenswrapper[7518]: I0319 09:20:36.618213 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"
Mar 19 09:20:36.651087 master-0 kubenswrapper[7518]: I0319 09:20:36.650948 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9" podStartSLOduration=1.6509294780000001 podStartE2EDuration="1.650929478s" podCreationTimestamp="2026-03-19 09:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:36.649371896 +0000 UTC m=+54.531955185" watchObservedRunningTime="2026-03-19 09:20:36.650929478 +0000 UTC m=+54.533512737"
Mar 19 09:20:36.692273 master-0 kubenswrapper[7518]: I0319 09:20:36.691171 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk" podStartSLOduration=2.849287989 podStartE2EDuration="17.691151119s" podCreationTimestamp="2026-03-19 09:20:19 +0000 UTC" firstStartedPulling="2026-03-19 09:20:21.034748427 +0000 UTC m=+38.917331686" lastFinishedPulling="2026-03-19 09:20:35.876611537 +0000 UTC m=+53.759194816" observedRunningTime="2026-03-19 09:20:36.66869498 +0000 UTC m=+54.551278249" watchObservedRunningTime="2026-03-19 09:20:36.691151119 +0000 UTC m=+54.573734388"
Mar 19 09:20:36.706125 master-0 kubenswrapper[7518]: I0319 09:20:36.705910 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"]
Mar 19 09:20:36.706741 master-0 kubenswrapper[7518]: I0319 09:20:36.706538 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.709434 master-0 kubenswrapper[7518]: I0319 09:20:36.709103 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"]
Mar 19 09:20:36.709597 master-0 kubenswrapper[7518]: I0319 09:20:36.709560 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:20:36.711499 master-0 kubenswrapper[7518]: I0319 09:20:36.710550 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:20:36.711499 master-0 kubenswrapper[7518]: I0319 09:20:36.710820 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:20:36.711499 master-0 kubenswrapper[7518]: I0319 09:20:36.711047 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:20:36.711499 master-0 kubenswrapper[7518]: I0319 09:20:36.711384 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:20:36.730894 master-0 kubenswrapper[7518]: I0319 09:20:36.730195 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9bb777f5-x9r5p"]
Mar 19 09:20:36.748912 master-0 kubenswrapper[7518]: I0319 09:20:36.748827 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"]
Mar 19 09:20:36.762460 master-0 kubenswrapper[7518]: I0319 09:20:36.762399 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:36.762460 master-0 kubenswrapper[7518]: I0319 09:20:36.762458 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:36.762769 master-0 kubenswrapper[7518]: I0319 09:20:36.762505 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:36.764997 master-0 kubenswrapper[7518]: E0319 09:20:36.764954 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 09:20:36.765178 master-0 kubenswrapper[7518]: E0319 09:20:36.765071 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.76504393 +0000 UTC m=+56.647627259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "etcd-client" not found
Mar 19 09:20:36.765178 master-0 kubenswrapper[7518]: E0319 09:20:36.765166 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:36.765271 master-0 kubenswrapper[7518]: E0319 09:20:36.765215 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.765185465 +0000 UTC m=+56.647768804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "serving-cert" not found
Mar 19 09:20:36.766680 master-0 kubenswrapper[7518]: E0319 09:20:36.765761 7518 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 09:20:36.766680 master-0 kubenswrapper[7518]: E0319 09:20:36.765802 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.765792236 +0000 UTC m=+56.648375545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : configmap "etcd-serving-ca" not found
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.863735 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xqr\" (UniqueName: \"kubernetes.io/projected/33f8ede1-66c2-4a48-a9c9-32002408150f-kube-api-access-47xqr\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.863876 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-client-ca\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.863755 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866d56f9b-6dc8n"]
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.864031 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: E0319 09:20:36.864319 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n" podUID="cfdbfc13-79f9-4369-990f-29b31f7ec8da"
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.864428 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-config\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.864508 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbab571-c45a-490f-bd7b-8d3c519e6d03-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.864524 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:36.865549 master-0 kubenswrapper[7518]: I0319 09:20:36.864551 7518 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dcbab571-c45a-490f-bd7b-8d3c519e6d03-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:36.901548 master-0 kubenswrapper[7518]: I0319 09:20:36.901305 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"]
Mar 19 09:20:36.901758 master-0 kubenswrapper[7518]: E0319 09:20:36.901609 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-47xqr serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4" podUID="33f8ede1-66c2-4a48-a9c9-32002408150f"
Mar 19 09:20:36.965095 master-0 kubenswrapper[7518]: I0319 09:20:36.965017 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.965095 master-0 kubenswrapper[7518]: I0319 09:20:36.965076 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-config\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.965095 master-0 kubenswrapper[7518]: I0319 09:20:36.965104 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47xqr\" (UniqueName: \"kubernetes.io/projected/33f8ede1-66c2-4a48-a9c9-32002408150f-kube-api-access-47xqr\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.965364 master-0 kubenswrapper[7518]: E0319 09:20:36.965222 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:36.965364 master-0 kubenswrapper[7518]: E0319 09:20:36.965296 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert podName:33f8ede1-66c2-4a48-a9c9-32002408150f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:37.465278925 +0000 UTC m=+55.347862184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert") pod "route-controller-manager-58bdf45c89-nnbc4" (UID: "33f8ede1-66c2-4a48-a9c9-32002408150f") : secret "serving-cert" not found
Mar 19 09:20:36.965460 master-0 kubenswrapper[7518]: I0319 09:20:36.965370 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-client-ca\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.965537 master-0 kubenswrapper[7518]: I0319 09:20:36.965489 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert\") pod \"controller-manager-866d56f9b-6dc8n\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:36.965628 master-0 kubenswrapper[7518]: E0319 09:20:36.965594 7518 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:36.965628 master-0 kubenswrapper[7518]: E0319 09:20:36.965627 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert podName:cfdbfc13-79f9-4369-990f-29b31f7ec8da nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.965621177 +0000 UTC m=+56.848204436 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert") pod "controller-manager-866d56f9b-6dc8n" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da") : secret "serving-cert" not found
Mar 19 09:20:36.966231 master-0 kubenswrapper[7518]: I0319 09:20:36.966188 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-config\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.966307 master-0 kubenswrapper[7518]: I0319 09:20:36.966281 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-client-ca\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:36.994389 master-0 kubenswrapper[7518]: I0319 09:20:36.994338 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xqr\" (UniqueName: \"kubernetes.io/projected/33f8ede1-66c2-4a48-a9c9-32002408150f-kube-api-access-47xqr\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:37.486727 master-0 kubenswrapper[7518]: I0319 09:20:37.486554 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " 
pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4" Mar 19 09:20:37.486727 master-0 kubenswrapper[7518]: E0319 09:20:37.486704 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:37.486967 master-0 kubenswrapper[7518]: E0319 09:20:37.486758 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert podName:33f8ede1-66c2-4a48-a9c9-32002408150f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.48674296 +0000 UTC m=+56.369326219 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert") pod "route-controller-manager-58bdf45c89-nnbc4" (UID: "33f8ede1-66c2-4a48-a9c9-32002408150f") : secret "serving-cert" not found Mar 19 09:20:37.644311 master-0 kubenswrapper[7518]: I0319 09:20:37.644272 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n" Mar 19 09:20:37.646460 master-0 kubenswrapper[7518]: I0319 09:20:37.646445 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4" Mar 19 09:20:37.719493 master-0 kubenswrapper[7518]: I0319 09:20:37.716762 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n" Mar 19 09:20:37.729319 master-0 kubenswrapper[7518]: I0319 09:20:37.725021 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4" Mar 19 09:20:37.904058 master-0 kubenswrapper[7518]: I0319 09:20:37.903929 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-config\") pod \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " Mar 19 09:20:37.904058 master-0 kubenswrapper[7518]: I0319 09:20:37.903997 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-config\") pod \"33f8ede1-66c2-4a48-a9c9-32002408150f\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " Mar 19 09:20:37.904058 master-0 kubenswrapper[7518]: I0319 09:20:37.904027 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-client-ca\") pod \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " Mar 19 09:20:37.904058 master-0 kubenswrapper[7518]: I0319 09:20:37.904049 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-client-ca\") pod \"33f8ede1-66c2-4a48-a9c9-32002408150f\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " Mar 19 09:20:37.904386 master-0 kubenswrapper[7518]: I0319 09:20:37.904089 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbfh6\" (UniqueName: \"kubernetes.io/projected/cfdbfc13-79f9-4369-990f-29b31f7ec8da-kube-api-access-fbfh6\") pod \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " Mar 19 09:20:37.904386 master-0 kubenswrapper[7518]: I0319 09:20:37.904120 7518 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-proxy-ca-bundles\") pod \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\" (UID: \"cfdbfc13-79f9-4369-990f-29b31f7ec8da\") " Mar 19 09:20:37.904386 master-0 kubenswrapper[7518]: I0319 09:20:37.904152 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47xqr\" (UniqueName: \"kubernetes.io/projected/33f8ede1-66c2-4a48-a9c9-32002408150f-kube-api-access-47xqr\") pod \"33f8ede1-66c2-4a48-a9c9-32002408150f\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " Mar 19 09:20:37.905208 master-0 kubenswrapper[7518]: I0319 09:20:37.905144 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-client-ca" (OuterVolumeSpecName: "client-ca") pod "cfdbfc13-79f9-4369-990f-29b31f7ec8da" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:37.905988 master-0 kubenswrapper[7518]: I0319 09:20:37.905889 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-config" (OuterVolumeSpecName: "config") pod "33f8ede1-66c2-4a48-a9c9-32002408150f" (UID: "33f8ede1-66c2-4a48-a9c9-32002408150f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:37.905988 master-0 kubenswrapper[7518]: I0319 09:20:37.905890 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cfdbfc13-79f9-4369-990f-29b31f7ec8da" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:37.906173 master-0 kubenswrapper[7518]: I0319 09:20:37.906125 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-client-ca" (OuterVolumeSpecName: "client-ca") pod "33f8ede1-66c2-4a48-a9c9-32002408150f" (UID: "33f8ede1-66c2-4a48-a9c9-32002408150f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:37.906372 master-0 kubenswrapper[7518]: I0319 09:20:37.906352 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-config" (OuterVolumeSpecName: "config") pod "cfdbfc13-79f9-4369-990f-29b31f7ec8da" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:37.909138 master-0 kubenswrapper[7518]: I0319 09:20:37.909060 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfdbfc13-79f9-4369-990f-29b31f7ec8da-kube-api-access-fbfh6" (OuterVolumeSpecName: "kube-api-access-fbfh6") pod "cfdbfc13-79f9-4369-990f-29b31f7ec8da" (UID: "cfdbfc13-79f9-4369-990f-29b31f7ec8da"). InnerVolumeSpecName "kube-api-access-fbfh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:20:37.913672 master-0 kubenswrapper[7518]: I0319 09:20:37.913619 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f8ede1-66c2-4a48-a9c9-32002408150f-kube-api-access-47xqr" (OuterVolumeSpecName: "kube-api-access-47xqr") pod "33f8ede1-66c2-4a48-a9c9-32002408150f" (UID: "33f8ede1-66c2-4a48-a9c9-32002408150f"). InnerVolumeSpecName "kube-api-access-47xqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:20:37.975662 master-0 kubenswrapper[7518]: I0319 09:20:37.975604 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-cvhxl"] Mar 19 09:20:37.976457 master-0 kubenswrapper[7518]: I0319 09:20:37.976432 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:37.978554 master-0 kubenswrapper[7518]: I0319 09:20:37.978509 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 09:20:37.984708 master-0 kubenswrapper[7518]: I0319 09:20:37.984663 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Mar 19 09:20:37.984901 master-0 kubenswrapper[7518]: I0319 09:20:37.984874 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:20:37.985041 master-0 kubenswrapper[7518]: I0319 09:20:37.985017 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:20:37.985168 master-0 kubenswrapper[7518]: I0319 09:20:37.985143 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:20:37.985270 master-0 kubenswrapper[7518]: I0319 09:20:37.985245 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:20:37.985386 master-0 kubenswrapper[7518]: I0319 09:20:37.985362 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:20:37.985492 master-0 kubenswrapper[7518]: I0319 09:20:37.985456 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:20:37.985675 master-0 kubenswrapper[7518]: I0319 09:20:37.985621 7518 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Mar 19 09:20:37.990367 master-0 kubenswrapper[7518]: I0319 09:20:37.990334 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:20:37.991650 master-0 kubenswrapper[7518]: I0319 09:20:37.991630 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-cvhxl"] Mar 19 09:20:38.009195 master-0 kubenswrapper[7518]: I0319 09:20:38.009145 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbfh6\" (UniqueName: \"kubernetes.io/projected/cfdbfc13-79f9-4369-990f-29b31f7ec8da-kube-api-access-fbfh6\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.009195 master-0 kubenswrapper[7518]: I0319 09:20:38.009189 7518 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.009195 master-0 kubenswrapper[7518]: I0319 09:20:38.009205 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47xqr\" (UniqueName: \"kubernetes.io/projected/33f8ede1-66c2-4a48-a9c9-32002408150f-kube-api-access-47xqr\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.009451 master-0 kubenswrapper[7518]: I0319 09:20:38.009219 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.009451 master-0 kubenswrapper[7518]: I0319 09:20:38.009232 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.009451 master-0 kubenswrapper[7518]: I0319 09:20:38.009243 7518 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cfdbfc13-79f9-4369-990f-29b31f7ec8da-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.009451 master-0 kubenswrapper[7518]: I0319 09:20:38.009255 7518 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33f8ede1-66c2-4a48-a9c9-32002408150f-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110053 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-config\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110139 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-trusted-ca-bundle\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110257 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit-dir\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110299 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client\") pod 
\"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110325 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110355 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-node-pullsecrets\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110390 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-image-import-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110421 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110444 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-encryption-config\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110497 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dns2\" (UniqueName: \"kubernetes.io/projected/b9db5264-1b6b-4a6a-b799-9ae1c1323186-kube-api-access-4dns2\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.112242 master-0 kubenswrapper[7518]: I0319 09:20:38.110530 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.214901 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-trusted-ca-bundle\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215035 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit-dir\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215063 7518 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215079 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215101 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-node-pullsecrets\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215123 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-image-import-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215145 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 
09:20:38.215160 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-encryption-config\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215183 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dns2\" (UniqueName: \"kubernetes.io/projected/b9db5264-1b6b-4a6a-b799-9ae1c1323186-kube-api-access-4dns2\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215218 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215494 master-0 kubenswrapper[7518]: I0319 09:20:38.215250 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-config\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.215998 master-0 kubenswrapper[7518]: I0319 09:20:38.215943 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-config\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: I0319 09:20:38.216982 
7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-trusted-ca-bundle\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: I0319 09:20:38.217036 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit-dir\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: E0319 09:20:38.217118 7518 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: E0319 09:20:38.217175 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.717158776 +0000 UTC m=+56.599742035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : secret "etcd-client" not found Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: E0319 09:20:38.217531 7518 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: E0319 09:20:38.217563 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:20:38.717553579 +0000 UTC m=+56.600136838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : secret "serving-cert" not found Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: I0319 09:20:38.217609 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-node-pullsecrets\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: I0319 09:20:38.217994 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-image-import-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: E0319 09:20:38.218045 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 19 09:20:38.218163 master-0 kubenswrapper[7518]: E0319 09:20:38.218076 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.718065777 +0000 UTC m=+56.600649036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "etcd-serving-ca" not found
Mar 19 09:20:38.218672 master-0 kubenswrapper[7518]: E0319 09:20:38.218619 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 09:20:38.220510 master-0 kubenswrapper[7518]: E0319 09:20:38.218717 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:38.718689768 +0000 UTC m=+56.601273127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "audit-0" not found
Mar 19 09:20:38.224081 master-0 kubenswrapper[7518]: I0319 09:20:38.224029 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-encryption-config\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:38.255130 master-0 kubenswrapper[7518]: I0319 09:20:38.255079 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dns2\" (UniqueName: \"kubernetes.io/projected/b9db5264-1b6b-4a6a-b799-9ae1c1323186-kube-api-access-4dns2\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:38.321508 master-0 kubenswrapper[7518]: I0319 09:20:38.320546 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcbab571-c45a-490f-bd7b-8d3c519e6d03" path="/var/lib/kubelet/pods/dcbab571-c45a-490f-bd7b-8d3c519e6d03/volumes"
Mar 19 09:20:38.520043 master-0 kubenswrapper[7518]: I0319 09:20:38.520004 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert\") pod \"route-controller-manager-58bdf45c89-nnbc4\" (UID: \"33f8ede1-66c2-4a48-a9c9-32002408150f\") " pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:38.520256 master-0 kubenswrapper[7518]: E0319 09:20:38.520204 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:38.520332 master-0 kubenswrapper[7518]: E0319 09:20:38.520295 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert podName:33f8ede1-66c2-4a48-a9c9-32002408150f nodeName:}" failed. No retries permitted until 2026-03-19 09:20:40.520275633 +0000 UTC m=+58.402858882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert") pod "route-controller-manager-58bdf45c89-nnbc4" (UID: "33f8ede1-66c2-4a48-a9c9-32002408150f") : secret "serving-cert" not found
Mar 19 09:20:38.650044 master-0 kubenswrapper[7518]: I0319 09:20:38.649999 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"
Mar 19 09:20:38.650859 master-0 kubenswrapper[7518]: I0319 09:20:38.650835 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerStarted","Data":"7d09aca9fefb402af8b2ae5b0086c54b39e7c40d8e4c2624e1555fd0e0a43d99"}
Mar 19 09:20:38.650929 master-0 kubenswrapper[7518]: I0319 09:20:38.650897 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866d56f9b-6dc8n"
Mar 19 09:20:38.723383 master-0 kubenswrapper[7518]: I0319 09:20:38.723255 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:38.723383 master-0 kubenswrapper[7518]: I0319 09:20:38.723326 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:38.723642 master-0 kubenswrapper[7518]: E0319 09:20:38.723493 7518 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:38.723642 master-0 kubenswrapper[7518]: E0319 09:20:38.723584 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:39.723564242 +0000 UTC m=+57.606147501 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : secret "serving-cert" not found
Mar 19 09:20:38.723726 master-0 kubenswrapper[7518]: I0319 09:20:38.723624 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:38.723774 master-0 kubenswrapper[7518]: I0319 09:20:38.723738 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:38.723898 master-0 kubenswrapper[7518]: E0319 09:20:38.723518 7518 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 09:20:38.723898 master-0 kubenswrapper[7518]: E0319 09:20:38.723908 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:39.723900133 +0000 UTC m=+57.606483392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : secret "etcd-client" not found
Mar 19 09:20:38.724003 master-0 kubenswrapper[7518]: E0319 09:20:38.723968 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 09:20:38.724043 master-0 kubenswrapper[7518]: E0319 09:20:38.724023 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:39.724009206 +0000 UTC m=+57.606592465 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "etcd-serving-ca" not found
Mar 19 09:20:38.724043 master-0 kubenswrapper[7518]: E0319 09:20:38.724034 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 09:20:38.724122 master-0 kubenswrapper[7518]: E0319 09:20:38.724105 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:39.724084679 +0000 UTC m=+57.606668018 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "audit-0" not found
Mar 19 09:20:38.807456 master-0 kubenswrapper[7518]: I0319 09:20:38.807369 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" podStartSLOduration=2.438952905 podStartE2EDuration="4.807346477s" podCreationTimestamp="2026-03-19 09:20:34 +0000 UTC" firstStartedPulling="2026-03-19 09:20:36.100181682 +0000 UTC m=+53.982764941" lastFinishedPulling="2026-03-19 09:20:38.468575244 +0000 UTC m=+56.351158513" observedRunningTime="2026-03-19 09:20:38.769704763 +0000 UTC m=+56.652288032" watchObservedRunningTime="2026-03-19 09:20:38.807346477 +0000 UTC m=+56.689929736"
Mar 19 09:20:38.822223 master-0 kubenswrapper[7518]: I0319 09:20:38.822175 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866d56f9b-6dc8n"]
Mar 19 09:20:38.825595 master-0 kubenswrapper[7518]: I0319 09:20:38.825557 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:38.826073 master-0 kubenswrapper[7518]: I0319 09:20:38.826032 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:38.826166 master-0 kubenswrapper[7518]: I0319 09:20:38.826115 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:38.826401 master-0 kubenswrapper[7518]: E0319 09:20:38.826337 7518 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 09:20:38.826462 master-0 kubenswrapper[7518]: E0319 09:20:38.826408 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:42.826391111 +0000 UTC m=+60.708974370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : configmap "etcd-serving-ca" not found
Mar 19 09:20:38.826567 master-0 kubenswrapper[7518]: E0319 09:20:38.826550 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 09:20:38.826629 master-0 kubenswrapper[7518]: E0319 09:20:38.826592 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:42.826584167 +0000 UTC m=+60.709167566 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "etcd-client" not found
Mar 19 09:20:38.826629 master-0 kubenswrapper[7518]: E0319 09:20:38.826589 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 09:20:38.826723 master-0 kubenswrapper[7518]: E0319 09:20:38.826654 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:42.826638159 +0000 UTC m=+60.709221418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "serving-cert" not found
Mar 19 09:20:38.839455 master-0 kubenswrapper[7518]: I0319 09:20:38.839411 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-866d56f9b-6dc8n"]
Mar 19 09:20:38.877666 master-0 kubenswrapper[7518]: I0319 09:20:38.877312 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"]
Mar 19 09:20:38.896906 master-0 kubenswrapper[7518]: I0319 09:20:38.896750 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bdf45c89-nnbc4"]
Mar 19 09:20:39.028990 master-0 kubenswrapper[7518]: I0319 09:20:39.028872 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfdbfc13-79f9-4369-990f-29b31f7ec8da-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:39.028990 master-0 kubenswrapper[7518]: I0319 09:20:39.028910 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33f8ede1-66c2-4a48-a9c9-32002408150f-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:39.737893 master-0 kubenswrapper[7518]: I0319 09:20:39.737830 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:39.737893 master-0 kubenswrapper[7518]: I0319 09:20:39.737883 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: E0319 09:20:39.738040 7518 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: I0319 09:20:39.738070 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: I0319 09:20:39.738112 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: E0319 09:20:39.738132 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:41.738110301 +0000 UTC m=+59.620693630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : secret "etcd-client" not found
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: E0319 09:20:39.738168 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: E0319 09:20:39.738224 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:41.738204654 +0000 UTC m=+59.620787983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "audit-0" not found
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: E0319 09:20:39.738324 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 09:20:39.738563 master-0 kubenswrapper[7518]: E0319 09:20:39.738432 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:41.738407842 +0000 UTC m=+59.620991111 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "etcd-serving-ca" not found
Mar 19 09:20:39.742516 master-0 kubenswrapper[7518]: I0319 09:20:39.742422 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:40.170416 master-0 kubenswrapper[7518]: I0319 09:20:40.170243 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"]
Mar 19 09:20:40.171045 master-0 kubenswrapper[7518]: I0319 09:20:40.170943 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.171204 master-0 kubenswrapper[7518]: I0319 09:20:40.171148 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65dbf9584-tg7x7"]
Mar 19 09:20:40.171837 master-0 kubenswrapper[7518]: I0319 09:20:40.171806 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.174072 master-0 kubenswrapper[7518]: I0319 09:20:40.173833 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:20:40.174415 master-0 kubenswrapper[7518]: I0319 09:20:40.174394 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:20:40.174566 master-0 kubenswrapper[7518]: I0319 09:20:40.174543 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:20:40.175101 master-0 kubenswrapper[7518]: I0319 09:20:40.175074 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:20:40.175695 master-0 kubenswrapper[7518]: I0319 09:20:40.175660 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:20:40.176513 master-0 kubenswrapper[7518]: I0319 09:20:40.176448 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:20:40.176736 master-0 kubenswrapper[7518]: I0319 09:20:40.176716 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:20:40.176792 master-0 kubenswrapper[7518]: I0319 09:20:40.176757 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:20:40.177182 master-0 kubenswrapper[7518]: I0319 09:20:40.177143 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:20:40.177491 master-0 kubenswrapper[7518]: I0319 09:20:40.177436 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:20:40.181797 master-0 kubenswrapper[7518]: I0319 09:20:40.181764 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:20:40.184615 master-0 kubenswrapper[7518]: I0319 09:20:40.184522 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"]
Mar 19 09:20:40.187746 master-0 kubenswrapper[7518]: I0319 09:20:40.187717 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65dbf9584-tg7x7"]
Mar 19 09:20:40.321770 master-0 kubenswrapper[7518]: I0319 09:20:40.321704 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f8ede1-66c2-4a48-a9c9-32002408150f" path="/var/lib/kubelet/pods/33f8ede1-66c2-4a48-a9c9-32002408150f/volumes"
Mar 19 09:20:40.322057 master-0 kubenswrapper[7518]: I0319 09:20:40.322030 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfdbfc13-79f9-4369-990f-29b31f7ec8da" path="/var/lib/kubelet/pods/cfdbfc13-79f9-4369-990f-29b31f7ec8da/volumes"
Mar 19 09:20:40.344891 master-0 kubenswrapper[7518]: I0319 09:20:40.344819 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6chx\" (UniqueName: \"kubernetes.io/projected/7214416f-03b4-4507-918b-ca3c0c95773e-kube-api-access-m6chx\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.345083 master-0 kubenswrapper[7518]: I0319 09:20:40.344911 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf60b652-41e7-492a-a1f1-d6b2f9680f67-serving-cert\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.345083 master-0 kubenswrapper[7518]: I0319 09:20:40.344939 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ksf\" (UniqueName: \"kubernetes.io/projected/cf60b652-41e7-492a-a1f1-d6b2f9680f67-kube-api-access-g9ksf\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.345083 master-0 kubenswrapper[7518]: I0319 09:20:40.344983 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-config\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.345083 master-0 kubenswrapper[7518]: I0319 09:20:40.345059 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.345252 master-0 kubenswrapper[7518]: I0319 09:20:40.345134 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-config\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.345252 master-0 kubenswrapper[7518]: I0319 09:20:40.345217 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-proxy-ca-bundles\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.345324 master-0 kubenswrapper[7518]: I0319 09:20:40.345283 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-client-ca\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.345363 master-0 kubenswrapper[7518]: I0319 09:20:40.345321 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-client-ca\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.446160 master-0 kubenswrapper[7518]: I0319 09:20:40.446029 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-config\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.446522 master-0 kubenswrapper[7518]: I0319 09:20:40.446201 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.446522 master-0 kubenswrapper[7518]: I0319 09:20:40.446331 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-config\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.446522 master-0 kubenswrapper[7518]: E0319 09:20:40.446350 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:40.446522 master-0 kubenswrapper[7518]: E0319 09:20:40.446429 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert podName:7214416f-03b4-4507-918b-ca3c0c95773e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:40.946409828 +0000 UTC m=+58.828993087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert") pod "route-controller-manager-58559b7f6c-j4rrt" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e") : secret "serving-cert" not found
Mar 19 09:20:40.446812 master-0 kubenswrapper[7518]: I0319 09:20:40.446768 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-proxy-ca-bundles\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.446879 master-0 kubenswrapper[7518]: I0319 09:20:40.446838 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-client-ca\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.446943 master-0 kubenswrapper[7518]: I0319 09:20:40.446885 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-client-ca\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.446943 master-0 kubenswrapper[7518]: I0319 09:20:40.446941 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6chx\" (UniqueName: \"kubernetes.io/projected/7214416f-03b4-4507-918b-ca3c0c95773e-kube-api-access-m6chx\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.447082 master-0 kubenswrapper[7518]: I0319 09:20:40.446995 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf60b652-41e7-492a-a1f1-d6b2f9680f67-serving-cert\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.447082 master-0 kubenswrapper[7518]: I0319 09:20:40.447015 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ksf\" (UniqueName: \"kubernetes.io/projected/cf60b652-41e7-492a-a1f1-d6b2f9680f67-kube-api-access-g9ksf\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.447239 master-0 kubenswrapper[7518]: I0319 09:20:40.447162 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-config\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.448034 master-0 kubenswrapper[7518]: I0319 09:20:40.447750 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-config\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.448034 master-0 kubenswrapper[7518]: I0319 09:20:40.447844 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-client-ca\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.448669 master-0 kubenswrapper[7518]: I0319 09:20:40.448628 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-client-ca\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.449819 master-0 kubenswrapper[7518]: I0319 09:20:40.449772 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-proxy-ca-bundles\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.451964 master-0 kubenswrapper[7518]: I0319 09:20:40.451943 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf60b652-41e7-492a-a1f1-d6b2f9680f67-serving-cert\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.467001 master-0 kubenswrapper[7518]: I0319 09:20:40.466934 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6chx\" (UniqueName: \"kubernetes.io/projected/7214416f-03b4-4507-918b-ca3c0c95773e-kube-api-access-m6chx\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.467935 master-0 kubenswrapper[7518]: I0319 09:20:40.467886 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9ksf\" (UniqueName: \"kubernetes.io/projected/cf60b652-41e7-492a-a1f1-d6b2f9680f67-kube-api-access-g9ksf\") pod \"controller-manager-65dbf9584-tg7x7\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.519728 master-0 kubenswrapper[7518]: I0319 09:20:40.519676 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7"
Mar 19 09:20:40.547014 master-0 kubenswrapper[7518]: I0319 09:20:40.545199 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:20:40.714566 master-0 kubenswrapper[7518]: I0319 09:20:40.714401 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65dbf9584-tg7x7"]
Mar 19 09:20:40.722929 master-0 kubenswrapper[7518]: W0319 09:20:40.722880 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf60b652_41e7_492a_a1f1_d6b2f9680f67.slice/crio-88434a59a7308bc36e38b535d7d9d2585acc58eac032cef32588d420be3ca90a WatchSource:0}: Error finding container 88434a59a7308bc36e38b535d7d9d2585acc58eac032cef32588d420be3ca90a: Status 404 returned error can't find the container with id 88434a59a7308bc36e38b535d7d9d2585acc58eac032cef32588d420be3ca90a
Mar 19 09:20:40.952715 master-0 kubenswrapper[7518]: I0319 09:20:40.952656 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"
Mar 19 09:20:40.953544 master-0 kubenswrapper[7518]: E0319 09:20:40.952862 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 09:20:40.953544 master-0 kubenswrapper[7518]: E0319 09:20:40.953111 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert podName:7214416f-03b4-4507-918b-ca3c0c95773e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:41.953053012 +0000 UTC m=+59.835636301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert") pod "route-controller-manager-58559b7f6c-j4rrt" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e") : secret "serving-cert" not found
Mar 19 09:20:41.671230 master-0 kubenswrapper[7518]: I0319 09:20:41.671171 7518 generic.go:334] "Generic (PLEG): container finished" podID="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" containerID="a481a6ff530440a1264d2535843bd9da5aad52194298733f7093828af5a8bb83" exitCode=0
Mar 19 09:20:41.671604 master-0 kubenswrapper[7518]: I0319 09:20:41.671259 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" event={"ID":"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8","Type":"ContainerDied","Data":"a481a6ff530440a1264d2535843bd9da5aad52194298733f7093828af5a8bb83"}
Mar 19 09:20:41.671688 master-0 kubenswrapper[7518]: I0319 09:20:41.671666 7518 scope.go:117] "RemoveContainer" containerID="a481a6ff530440a1264d2535843bd9da5aad52194298733f7093828af5a8bb83"
Mar 19 09:20:41.672900 master-0 kubenswrapper[7518]: I0319 09:20:41.672757 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" event={"ID":"cf60b652-41e7-492a-a1f1-d6b2f9680f67","Type":"ContainerStarted","Data":"88434a59a7308bc36e38b535d7d9d2585acc58eac032cef32588d420be3ca90a"}
Mar 19 09:20:41.759835 master-0 kubenswrapper[7518]: I0319 09:20:41.759613 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:41.759835 master-0 kubenswrapper[7518]: I0319 09:20:41.759717 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:41.759835 master-0 kubenswrapper[7518]: I0319 09:20:41.759771 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl"
Mar 19 09:20:41.760642 master-0 kubenswrapper[7518]: E0319 09:20:41.759895 7518 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 09:20:41.760642 master-0 kubenswrapper[7518]: E0319 09:20:41.759975 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:45.759947965 +0000 UTC m=+63.642531224 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : configmap "audit-0" not found Mar 19 09:20:41.760642 master-0 kubenswrapper[7518]: E0319 09:20:41.760200 7518 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 09:20:41.760642 master-0 kubenswrapper[7518]: E0319 09:20:41.760279 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client podName:b9db5264-1b6b-4a6a-b799-9ae1c1323186 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:45.760255156 +0000 UTC m=+63.642838615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client") pod "apiserver-c765cd67b-cvhxl" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186") : secret "etcd-client" not found Mar 19 09:20:41.761590 master-0 kubenswrapper[7518]: I0319 09:20:41.761405 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"apiserver-c765cd67b-cvhxl\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:41.964291 master-0 kubenswrapper[7518]: I0319 09:20:41.964228 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:20:41.965216 master-0 kubenswrapper[7518]: E0319 09:20:41.964462 7518 secret.go:189] 
Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:41.965216 master-0 kubenswrapper[7518]: E0319 09:20:41.964533 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert podName:7214416f-03b4-4507-918b-ca3c0c95773e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:43.964515817 +0000 UTC m=+61.847099086 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert") pod "route-controller-manager-58559b7f6c-j4rrt" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e") : secret "serving-cert" not found Mar 19 09:20:42.678732 master-0 kubenswrapper[7518]: I0319 09:20:42.678417 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" event={"ID":"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8","Type":"ContainerStarted","Data":"1ab7108f2d0d95e899700fa23b90ec0d13089f5f042d9994efddbfeb30eefd68"} Mar 19 09:20:42.679563 master-0 kubenswrapper[7518]: I0319 09:20:42.679526 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:20:42.875750 master-0 kubenswrapper[7518]: I0319 09:20:42.875646 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:42.875750 master-0 kubenswrapper[7518]: I0319 09:20:42.875761 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") pod 
\"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:42.876134 master-0 kubenswrapper[7518]: E0319 09:20:42.875794 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:42.876134 master-0 kubenswrapper[7518]: E0319 09:20:42.875866 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:50.875847533 +0000 UTC m=+68.758430873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "serving-cert" not found Mar 19 09:20:42.876134 master-0 kubenswrapper[7518]: I0319 09:20:42.875799 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:42.876289 master-0 kubenswrapper[7518]: E0319 09:20:42.876172 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/etcd-client: secret "etcd-client" not found Mar 19 09:20:42.876289 master-0 kubenswrapper[7518]: E0319 09:20:42.876200 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client podName:f6bc6cad-d4ba-4d22-b9c9-117c91de19a1 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:50.876191096 +0000 UTC m=+68.758774435 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client") pod "apiserver-5547669f67-dhd9c" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1") : secret "etcd-client" not found Mar 19 09:20:42.876570 master-0 kubenswrapper[7518]: I0319 09:20:42.876530 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"apiserver-5547669f67-dhd9c\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" Mar 19 09:20:43.006811 master-0 kubenswrapper[7518]: I0319 09:20:43.006624 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:20:43.197278 master-0 kubenswrapper[7518]: I0319 09:20:43.192875 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-cvhxl"] Mar 19 09:20:43.197278 master-0 kubenswrapper[7518]: E0319 09:20:43.193186 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit etcd-client], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" podUID="b9db5264-1b6b-4a6a-b799-9ae1c1323186" Mar 19 09:20:43.681106 master-0 kubenswrapper[7518]: I0319 09:20:43.681055 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:43.688528 master-0 kubenswrapper[7518]: I0319 09:20:43.688289 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.793901 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-image-import-ca\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.793956 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit-dir\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.793992 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-node-pullsecrets\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.794014 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.794039 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-trusted-ca-bundle\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.794094 7518 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.794122 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-config\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.794144 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-encryption-config\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.794178 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dns2\" (UniqueName: \"kubernetes.io/projected/b9db5264-1b6b-4a6a-b799-9ae1c1323186-kube-api-access-4dns2\") pod \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\" (UID: \"b9db5264-1b6b-4a6a-b799-9ae1c1323186\") " Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.795625 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.796203 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.796237 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.796269 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.796903 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-config" (OuterVolumeSpecName: "config") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:43.797557 master-0 kubenswrapper[7518]: I0319 09:20:43.797368 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:43.801648 master-0 kubenswrapper[7518]: I0319 09:20:43.799483 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:43.801648 master-0 kubenswrapper[7518]: I0319 09:20:43.801558 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:43.801648 master-0 kubenswrapper[7518]: I0319 09:20:43.801606 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9db5264-1b6b-4a6a-b799-9ae1c1323186-kube-api-access-4dns2" (OuterVolumeSpecName: "kube-api-access-4dns2") pod "b9db5264-1b6b-4a6a-b799-9ae1c1323186" (UID: "b9db5264-1b6b-4a6a-b799-9ae1c1323186"). InnerVolumeSpecName "kube-api-access-4dns2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.895898 7518 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.895946 7518 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.895959 7518 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b9db5264-1b6b-4a6a-b799-9ae1c1323186-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.895970 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.895984 7518 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.895998 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.896009 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-config\") on 
node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.896021 7518 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.896494 master-0 kubenswrapper[7518]: I0319 09:20:43.896033 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dns2\" (UniqueName: \"kubernetes.io/projected/b9db5264-1b6b-4a6a-b799-9ae1c1323186-kube-api-access-4dns2\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:43.997628 master-0 kubenswrapper[7518]: I0319 09:20:43.997449 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:20:43.997853 master-0 kubenswrapper[7518]: E0319 09:20:43.997824 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:43.997919 master-0 kubenswrapper[7518]: E0319 09:20:43.997905 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert podName:7214416f-03b4-4507-918b-ca3c0c95773e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.997881061 +0000 UTC m=+65.880464320 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert") pod "route-controller-manager-58559b7f6c-j4rrt" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e") : secret "serving-cert" not found Mar 19 09:20:44.690769 master-0 kubenswrapper[7518]: I0319 09:20:44.690385 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" event={"ID":"cf60b652-41e7-492a-a1f1-d6b2f9680f67","Type":"ContainerStarted","Data":"4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3"} Mar 19 09:20:44.690769 master-0 kubenswrapper[7518]: I0319 09:20:44.690751 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" Mar 19 09:20:44.692335 master-0 kubenswrapper[7518]: I0319 09:20:44.690873 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-cvhxl" Mar 19 09:20:44.697216 master-0 kubenswrapper[7518]: I0319 09:20:44.697163 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" Mar 19 09:20:44.921437 master-0 kubenswrapper[7518]: I0319 09:20:44.921173 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" podStartSLOduration=3.46363742 podStartE2EDuration="6.921147852s" podCreationTimestamp="2026-03-19 09:20:38 +0000 UTC" firstStartedPulling="2026-03-19 09:20:40.726235947 +0000 UTC m=+58.608819206" lastFinishedPulling="2026-03-19 09:20:44.183746379 +0000 UTC m=+62.066329638" observedRunningTime="2026-03-19 09:20:44.917778468 +0000 UTC m=+62.800361747" watchObservedRunningTime="2026-03-19 09:20:44.921147852 +0000 UTC m=+62.803731111" Mar 19 09:20:45.036502 master-0 kubenswrapper[7518]: I0319 09:20:45.034072 7518 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"] Mar 19 09:20:45.036502 master-0 kubenswrapper[7518]: E0319 09:20:45.034518 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etcd-client serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c" podUID="f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" Mar 19 09:20:45.217969 master-0 kubenswrapper[7518]: I0319 09:20:45.217914 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-66c44d7ccf-z4ssv"] Mar 19 09:20:45.218689 master-0 kubenswrapper[7518]: I0319 09:20:45.218660 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.222815 master-0 kubenswrapper[7518]: I0319 09:20:45.222751 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:20:45.222815 master-0 kubenswrapper[7518]: I0319 09:20:45.222770 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 09:20:45.223014 master-0 kubenswrapper[7518]: I0319 09:20:45.222774 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:20:45.223014 master-0 kubenswrapper[7518]: I0319 09:20:45.222932 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 09:20:45.223014 master-0 kubenswrapper[7518]: I0319 09:20:45.222978 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:20:45.223014 master-0 kubenswrapper[7518]: I0319 09:20:45.223046 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:20:45.223229 master-0 
kubenswrapper[7518]: I0319 09:20:45.223155 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 09:20:45.223415 master-0 kubenswrapper[7518]: I0319 09:20:45.223393 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:20:45.224343 master-0 kubenswrapper[7518]: I0319 09:20:45.224287 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:20:45.230647 master-0 kubenswrapper[7518]: I0319 09:20:45.230584 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-cvhxl"] Mar 19 09:20:45.232329 master-0 kubenswrapper[7518]: I0319 09:20:45.232284 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:20:45.256496 master-0 kubenswrapper[7518]: I0319 09:20:45.255440 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-66c44d7ccf-z4ssv"] Mar 19 09:20:45.263493 master-0 kubenswrapper[7518]: I0319 09:20:45.262479 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-cvhxl"] Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273048 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-audit-dir\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273149 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbxp\" (UniqueName: \"kubernetes.io/projected/37533d4d-1eed-4f61-853e-4536958bf13a-kube-api-access-bnbxp\") pod \"apiserver-66c44d7ccf-z4ssv\" 
(UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273181 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-audit\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273221 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-image-import-ca\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273274 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-trusted-ca-bundle\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273296 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-serving-ca\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273311 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-client\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273324 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-encryption-config\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273350 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-node-pullsecrets\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273377 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-serving-cert\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273402 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-config\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273455 7518 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b9db5264-1b6b-4a6a-b799-9ae1c1323186-audit\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:45.275486 master-0 kubenswrapper[7518]: I0319 09:20:45.273481 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9db5264-1b6b-4a6a-b799-9ae1c1323186-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:45.374918 master-0 kubenswrapper[7518]: I0319 09:20:45.374829 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-trusted-ca-bundle\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.375342 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-serving-ca\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.375437 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-client\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.375501 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-encryption-config\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: 
\"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376241 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-serving-ca\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376430 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-node-pullsecrets\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376502 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-serving-cert\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376562 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-config\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376626 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-audit-dir\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: 
\"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376731 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbxp\" (UniqueName: \"kubernetes.io/projected/37533d4d-1eed-4f61-853e-4536958bf13a-kube-api-access-bnbxp\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376777 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-audit\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.376831 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-image-import-ca\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.377340 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-image-import-ca\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.377406 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-audit-dir\") pod 
\"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.378108 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-config\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.378203 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-trusted-ca-bundle\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.378214 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-node-pullsecrets\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.378308 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-audit\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.379283 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-encryption-config\") pod 
\"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.380859 master-0 kubenswrapper[7518]: I0319 09:20:45.379768 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-client\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.381519 master-0 kubenswrapper[7518]: I0319 09:20:45.381295 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-serving-cert\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.396148 master-0 kubenswrapper[7518]: I0319 09:20:45.396085 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbxp\" (UniqueName: \"kubernetes.io/projected/37533d4d-1eed-4f61-853e-4536958bf13a-kube-api-access-bnbxp\") pod \"apiserver-66c44d7ccf-z4ssv\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.541505 master-0 kubenswrapper[7518]: I0319 09:20:45.541318 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv"
Mar 19 09:20:45.693629 master-0 kubenswrapper[7518]: I0319 09:20:45.693574 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:45.700535 master-0 kubenswrapper[7518]: I0319 09:20:45.700496 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:45.889943 master-0 kubenswrapper[7518]: I0319 09:20:45.889400 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-trusted-ca-bundle\") pod \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") "
Mar 19 09:20:45.889943 master-0 kubenswrapper[7518]: I0319 09:20:45.889950 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-encryption-config\") pod \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") "
Mar 19 09:20:45.890197 master-0 kubenswrapper[7518]: I0319 09:20:45.890019 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") pod \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") "
Mar 19 09:20:45.890241 master-0 kubenswrapper[7518]: I0319 09:20:45.890051 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-dir\") pod \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") "
Mar 19 09:20:45.890317 master-0 kubenswrapper[7518]: I0319 09:20:45.890281 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-policies\") pod \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") "
Mar 19 09:20:45.890403 master-0 kubenswrapper[7518]: I0319 09:20:45.890370 7518
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w7jg\" (UniqueName: \"kubernetes.io/projected/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-kube-api-access-4w7jg\") pod \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\" (UID: \"f6bc6cad-d4ba-4d22-b9c9-117c91de19a1\") "
Mar 19 09:20:45.890484 master-0 kubenswrapper[7518]: I0319 09:20:45.890425 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:45.890570 master-0 kubenswrapper[7518]: I0319 09:20:45.890431 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:20:45.890941 master-0 kubenswrapper[7518]: I0319 09:20:45.890902 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:45.890941 master-0 kubenswrapper[7518]: I0319 09:20:45.890933 7518 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:45.891261 master-0 kubenswrapper[7518]: I0319 09:20:45.891201 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:45.892899 master-0 kubenswrapper[7518]: I0319 09:20:45.892080 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:20:45.996027 master-0 kubenswrapper[7518]: I0319 09:20:45.995972 7518 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-audit-policies\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:45.996027 master-0 kubenswrapper[7518]: I0319 09:20:45.996018 7518 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:46.113711 master-0 kubenswrapper[7518]: I0319 09:20:46.113569 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-kube-api-access-4w7jg" (OuterVolumeSpecName: "kube-api-access-4w7jg") pod "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1"). InnerVolumeSpecName "kube-api-access-4w7jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:20:46.115821 master-0 kubenswrapper[7518]: I0319 09:20:46.115651 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" (UID: "f6bc6cad-d4ba-4d22-b9c9-117c91de19a1"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:20:46.199172 master-0 kubenswrapper[7518]: I0319 09:20:46.199106 7518 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:46.199172 master-0 kubenswrapper[7518]: I0319 09:20:46.199163 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w7jg\" (UniqueName: \"kubernetes.io/projected/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-kube-api-access-4w7jg\") on node \"master-0\" DevicePath \"\""
Mar 19 09:20:46.344997 master-0 kubenswrapper[7518]: I0319 09:20:46.344943 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9db5264-1b6b-4a6a-b799-9ae1c1323186" path="/var/lib/kubelet/pods/b9db5264-1b6b-4a6a-b799-9ae1c1323186/volumes"
Mar 19 09:20:46.537870 master-0 kubenswrapper[7518]: I0319 09:20:46.537824 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-66c44d7ccf-z4ssv"]
Mar 19 09:20:46.583032 master-0 kubenswrapper[7518]: I0319 09:20:46.582964 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:20:46.697178 master-0 kubenswrapper[7518]: I0319 09:20:46.697122 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"
Mar 19 09:20:47.139616 master-0 kubenswrapper[7518]: I0319 09:20:47.139461 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"]
Mar 19 09:20:47.146619 master-0 kubenswrapper[7518]: I0319 09:20:47.143388 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"]
Mar 19 09:20:47.146619 master-0 kubenswrapper[7518]: I0319 09:20:47.143543 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.146619 master-0 kubenswrapper[7518]: I0319 09:20:47.145898 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:20:47.149062 master-0 kubenswrapper[7518]: I0319 09:20:47.148870 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:20:47.149165 master-0 kubenswrapper[7518]: I0319 09:20:47.149141 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 09:20:47.149639 master-0 kubenswrapper[7518]: I0319 09:20:47.149337 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:20:47.149639 master-0 kubenswrapper[7518]: I0319 09:20:47.149338 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 09:20:47.149639 master-0 kubenswrapper[7518]: I0319 09:20:47.149412 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 09:20:47.149639 master-0 kubenswrapper[7518]: I0319 09:20:47.149453 7518 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 09:20:47.149639 master-0 kubenswrapper[7518]: I0319 09:20:47.149507 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:20:47.246338 master-0 kubenswrapper[7518]: I0319 09:20:47.246302 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-dhd9c"]
Mar 19 09:20:47.248618 master-0 kubenswrapper[7518]: I0319 09:20:47.248575 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"]
Mar 19 09:20:47.251139 master-0 kubenswrapper[7518]: I0319 09:20:47.251078 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251235 master-0 kubenswrapper[7518]: I0319 09:20:47.251153 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-client\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251235 master-0 kubenswrapper[7518]: I0319 09:20:47.251203 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-serving-ca\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251338 master-0 kubenswrapper[7518]: I0319 09:20:47.251240 7518
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-trusted-ca-bundle\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251338 master-0 kubenswrapper[7518]: I0319 09:20:47.251301 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-dir\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251338 master-0 kubenswrapper[7518]: I0319 09:20:47.251327 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-policies\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251459 master-0 kubenswrapper[7518]: I0319 09:20:47.251350 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trwr\" (UniqueName: \"kubernetes.io/projected/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-kube-api-access-5trwr\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.251459 master-0 kubenswrapper[7518]: I0319 09:20:47.251411 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-encryption-config\") pod \"apiserver-7558b877c5-pb68b\" (UID: 
\"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.353340 master-0 kubenswrapper[7518]: I0319 09:20:47.353228 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-policies\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.353764 master-0 kubenswrapper[7518]: I0319 09:20:47.353491 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trwr\" (UniqueName: \"kubernetes.io/projected/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-kube-api-access-5trwr\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.353764 master-0 kubenswrapper[7518]: I0319 09:20:47.353548 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:20:47.353764 master-0 kubenswrapper[7518]: I0319 09:20:47.353580 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:20:47.353764 master-0 kubenswrapper[7518]: I0319 09:20:47.353609 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-mc76b\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:20:47.353764 master-0 kubenswrapper[7518]: I0319 09:20:47.353639 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-encryption-config\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.353989 master-0 kubenswrapper[7518]: I0319 09:20:47.353892 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:20:47.354039 master-0 kubenswrapper[7518]: I0319 09:20:47.354003 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:20:47.354089 master-0 kubenswrapper[7518]: I0319 09:20:47.354059 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: 
\"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:20:47.354156 master-0 kubenswrapper[7518]: E0319 09:20:47.354092 7518 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 09:20:47.354201 master-0 kubenswrapper[7518]: E0319 09:20:47.354169 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs podName:bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d nodeName:}" failed. No retries permitted until 2026-03-19 09:21:51.354147558 +0000 UTC m=+129.236730987 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-mc76b" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d") : secret "multus-admission-controller-secret" not found
Mar 19 09:20:47.354266 master-0 kubenswrapper[7518]: E0319 09:20:47.354197 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 09:20:47.354374 master-0 kubenswrapper[7518]: I0319 09:20:47.354308 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:20:47.354433 master-0 kubenswrapper[7518]: E0319 09:20:47.354336 7518 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:47.354485 master-0 kubenswrapper[7518]: E0319 09:20:47.354377 7518 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert podName:208939f5-8fca-4fd5-b0c6-43484b7d1e30 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:51.354359975 +0000 UTC m=+129.236943424 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert") pod "catalog-operator-68f85b4d6c-j92kd" (UID: "208939f5-8fca-4fd5-b0c6-43484b7d1e30") : secret "catalog-operator-serving-cert" not found
Mar 19 09:20:47.354596 master-0 kubenswrapper[7518]: I0319 09:20:47.354525 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:20:47.354643 master-0 kubenswrapper[7518]: I0319 09:20:47.354626 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:20:47.354697 master-0 kubenswrapper[7518]: I0319 09:20:47.354661 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.354746 master-0 kubenswrapper[7518]: I0319 09:20:47.354727 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-client\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"
Mar 19 09:20:47.354746 master-0 kubenswrapper[7518]: E0319 09:20:47.354736 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls podName:7ad3ef11-90df-40b1-acbf-ed9b0c708ddb nodeName:}" failed. No retries permitted until 2026-03-19 09:21:51.354722007 +0000 UTC m=+129.237305466 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-z2869" (UID: "7ad3ef11-90df-40b1-acbf-ed9b0c708ddb") : secret "cluster-monitoring-operator-tls" not found
Mar 19 09:20:47.354843 master-0 kubenswrapper[7518]: I0319 09:20:47.354770 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:20:47.354843 master-0 kubenswrapper[7518]: I0319 09:20:47.354810 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:20:47.354932 master-0 kubenswrapper[7518]: I0319 09:20:47.354846 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-serving-ca\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.354932 master-0 kubenswrapper[7518]: I0319 09:20:47.354879 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-trusted-ca-bundle\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.355014 master-0 kubenswrapper[7518]: E0319 09:20:47.355003 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:47.355057 master-0 kubenswrapper[7518]: E0319 09:20:47.355021 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 09:20:47.355215 master-0 kubenswrapper[7518]: E0319 09:20:47.355037 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert podName:cabb0e91-c3ad-4142-8834-06fd8d55c0b7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:47.855027877 +0000 UTC m=+65.737611136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert") pod "apiserver-7558b877c5-pb68b" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7") : secret "serving-cert" not found Mar 19 09:20:47.355215 master-0 kubenswrapper[7518]: E0319 09:20:47.355138 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert podName:8aa0f17a-287e-4a19-9a59-4913e7707071 nodeName:}" failed. No retries permitted until 2026-03-19 09:21:51.35511309 +0000 UTC m=+129.237696419 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert") pod "olm-operator-5c9796789-wjbt2" (UID: "8aa0f17a-287e-4a19-9a59-4913e7707071") : secret "olm-operator-serving-cert" not found Mar 19 09:20:47.355215 master-0 kubenswrapper[7518]: E0319 09:20:47.355163 7518 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 09:20:47.355215 master-0 kubenswrapper[7518]: E0319 09:20:47.355195 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert podName:1f2148fe-f9f6-47da-894c-b88dae360ebe nodeName:}" failed. No retries permitted until 2026-03-19 09:21:51.355185864 +0000 UTC m=+129.237769303 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-gltb5" (UID: "1f2148fe-f9f6-47da-894c-b88dae360ebe") : secret "package-server-manager-serving-cert" not found Mar 19 09:20:47.355409 master-0 kubenswrapper[7518]: I0319 09:20:47.355223 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:20:47.355409 master-0 kubenswrapper[7518]: I0319 09:20:47.355262 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:20:47.355409 master-0 kubenswrapper[7518]: I0319 09:20:47.355290 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-dir\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.355409 master-0 kubenswrapper[7518]: I0319 09:20:47.355364 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:47.355409 master-0 kubenswrapper[7518]: I0319 09:20:47.355381 7518 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:47.355656 master-0 kubenswrapper[7518]: I0319 09:20:47.355428 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-dir\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.355656 master-0 kubenswrapper[7518]: I0319 09:20:47.355481 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-trusted-ca-bundle\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.355656 master-0 kubenswrapper[7518]: E0319 09:20:47.355516 7518 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 09:20:47.355656 master-0 kubenswrapper[7518]: E0319 09:20:47.355562 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs podName:4256d841-23cb-4756-b827-f44ee6e54def nodeName:}" failed. No retries permitted until 2026-03-19 09:21:51.355548976 +0000 UTC m=+129.238132315 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs") pod "network-metrics-daemon-p76jz" (UID: "4256d841-23cb-4756-b827-f44ee6e54def") : secret "metrics-daemon-secret" not found Mar 19 09:20:47.355882 master-0 kubenswrapper[7518]: I0319 09:20:47.355741 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-policies\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.355930 master-0 kubenswrapper[7518]: I0319 09:20:47.355875 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-serving-ca\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.358295 master-0 kubenswrapper[7518]: I0319 09:20:47.358238 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:20:47.358422 master-0 kubenswrapper[7518]: I0319 09:20:47.358375 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " 
pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:20:47.359093 master-0 kubenswrapper[7518]: I0319 09:20:47.359038 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-encryption-config\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.359385 master-0 kubenswrapper[7518]: I0319 09:20:47.359328 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:20:47.359456 master-0 kubenswrapper[7518]: I0319 09:20:47.359404 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:20:47.359456 master-0 kubenswrapper[7518]: I0319 09:20:47.359452 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-client\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.359584 master-0 kubenswrapper[7518]: I0319 09:20:47.359501 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:20:47.360422 master-0 kubenswrapper[7518]: I0319 09:20:47.360371 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:20:47.361167 master-0 kubenswrapper[7518]: I0319 09:20:47.361122 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"cluster-version-operator-56d8475767-sbhx2\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:20:47.438425 master-0 kubenswrapper[7518]: I0319 09:20:47.438341 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:20:47.438815 master-0 kubenswrapper[7518]: I0319 09:20:47.438434 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:20:47.439047 master-0 kubenswrapper[7518]: I0319 09:20:47.438481 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:20:47.439047 master-0 kubenswrapper[7518]: I0319 09:20:47.438963 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:20:47.439147 master-0 kubenswrapper[7518]: I0319 09:20:47.438544 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:20:47.439147 master-0 kubenswrapper[7518]: I0319 09:20:47.438629 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:20:47.462166 master-0 kubenswrapper[7518]: W0319 09:20:47.462086 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b1ae47_ef83_448d_b40d_a836cb6c6fc0.slice/crio-e8d72b34e27d40c589a01f72d5d166b2daee8cc6371b989889cbb67dad2e3fcc WatchSource:0}: Error finding container e8d72b34e27d40c589a01f72d5d166b2daee8cc6371b989889cbb67dad2e3fcc: Status 404 returned error can't find the container with id e8d72b34e27d40c589a01f72d5d166b2daee8cc6371b989889cbb67dad2e3fcc Mar 19 09:20:47.525487 master-0 kubenswrapper[7518]: I0319 09:20:47.525425 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:20:47.526132 master-0 kubenswrapper[7518]: I0319 09:20:47.526104 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.529778 master-0 kubenswrapper[7518]: I0319 09:20:47.527874 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 09:20:47.538727 master-0 kubenswrapper[7518]: I0319 09:20:47.537655 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trwr\" (UniqueName: \"kubernetes.io/projected/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-kube-api-access-5trwr\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.584298 master-0 kubenswrapper[7518]: I0319 09:20:47.584237 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:20:47.665018 master-0 kubenswrapper[7518]: I0319 09:20:47.664289 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-var-lock\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.665018 master-0 kubenswrapper[7518]: I0319 09:20:47.664368 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8edda930-b012-4f1f-977a-a71ef8763fe3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.665018 master-0 kubenswrapper[7518]: I0319 09:20:47.664407 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-kubelet-dir\") pod 
\"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.718136 master-0 kubenswrapper[7518]: I0319 09:20:47.718085 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" event={"ID":"32b1ae47-ef83-448d-b40d-a836cb6c6fc0","Type":"ContainerStarted","Data":"e8d72b34e27d40c589a01f72d5d166b2daee8cc6371b989889cbb67dad2e3fcc"} Mar 19 09:20:47.719792 master-0 kubenswrapper[7518]: I0319 09:20:47.719762 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" event={"ID":"37533d4d-1eed-4f61-853e-4536958bf13a","Type":"ContainerStarted","Data":"7cff4d9dd2b2d25ee0021fa3a61334c6c54bbb2fd78b56037cdd3dc4bff919b6"} Mar 19 09:20:47.765870 master-0 kubenswrapper[7518]: I0319 09:20:47.765290 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.765870 master-0 kubenswrapper[7518]: I0319 09:20:47.765405 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-var-lock\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.765870 master-0 kubenswrapper[7518]: I0319 09:20:47.765459 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8edda930-b012-4f1f-977a-a71ef8763fe3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " 
pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.765870 master-0 kubenswrapper[7518]: I0319 09:20:47.765645 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.765870 master-0 kubenswrapper[7518]: I0319 09:20:47.765731 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-var-lock\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.866501 master-0 kubenswrapper[7518]: I0319 09:20:47.866050 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:47.866501 master-0 kubenswrapper[7518]: E0319 09:20:47.866262 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:47.866501 master-0 kubenswrapper[7518]: E0319 09:20:47.866322 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert podName:cabb0e91-c3ad-4142-8834-06fd8d55c0b7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:48.866305159 +0000 UTC m=+66.748888418 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert") pod "apiserver-7558b877c5-pb68b" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7") : secret "serving-cert" not found Mar 19 09:20:47.868426 master-0 kubenswrapper[7518]: I0319 09:20:47.868315 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"] Mar 19 09:20:47.875157 master-0 kubenswrapper[7518]: I0319 09:20:47.875054 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-6qck2"] Mar 19 09:20:47.877116 master-0 kubenswrapper[7518]: W0319 09:20:47.877069 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda417fe25_4aca_471c_941d_c195b6141042.slice/crio-477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991 WatchSource:0}: Error finding container 477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991: Status 404 returned error can't find the container with id 477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991 Mar 19 09:20:47.879112 master-0 kubenswrapper[7518]: W0319 09:20:47.879064 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e92e5d_61ea_45b2_b357_ebffdaebf4af.slice/crio-50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500 WatchSource:0}: Error finding container 50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500: Status 404 returned error can't find the container with id 50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500 Mar 19 09:20:47.886032 master-0 kubenswrapper[7518]: I0319 09:20:47.885819 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8edda930-b012-4f1f-977a-a71ef8763fe3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:47.896560 master-0 kubenswrapper[7518]: I0319 09:20:47.895287 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"] Mar 19 09:20:47.904534 master-0 kubenswrapper[7518]: I0319 09:20:47.904458 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:47.920596 master-0 kubenswrapper[7518]: I0319 09:20:47.909963 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 09:20:47.920596 master-0 kubenswrapper[7518]: I0319 09:20:47.911803 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 09:20:47.920596 master-0 kubenswrapper[7518]: I0319 09:20:47.920318 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 09:20:47.964605 master-0 kubenswrapper[7518]: I0319 09:20:47.961280 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"] Mar 19 09:20:47.969146 master-0 kubenswrapper[7518]: I0319 09:20:47.969057 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:47.969377 master-0 kubenswrapper[7518]: I0319 09:20:47.969266 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db42b38e-294e-4016-8ac1-54126ac60de8-cache\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:47.969377 master-0 kubenswrapper[7518]: I0319 09:20:47.969337 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:47.969480 master-0 kubenswrapper[7518]: I0319 09:20:47.969410 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dwx6\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-kube-api-access-8dwx6\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:47.969480 master-0 kubenswrapper[7518]: I0319 09:20:47.969434 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.034273 master-0 kubenswrapper[7518]: I0319 09:20:48.034215 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"] Mar 19 09:20:48.034958 master-0 kubenswrapper[7518]: I0319 09:20:48.034935 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.041498 master-0 kubenswrapper[7518]: I0319 09:20:48.041443 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 09:20:48.041676 master-0 kubenswrapper[7518]: I0319 09:20:48.041635 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 09:20:48.041848 master-0 kubenswrapper[7518]: I0319 09:20:48.041822 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 09:20:48.050685 master-0 kubenswrapper[7518]: I0319 09:20:48.050174 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 09:20:48.069905 master-0 kubenswrapper[7518]: I0319 09:20:48.069784 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"] Mar 19 09:20:48.071753 master-0 kubenswrapper[7518]: I0319 09:20:48.071717 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:20:48.071852 master-0 kubenswrapper[7518]: I0319 09:20:48.071782 7518 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db42b38e-294e-4016-8ac1-54126ac60de8-cache\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.071852 master-0 kubenswrapper[7518]: I0319 09:20:48.071814 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.071941 master-0 kubenswrapper[7518]: I0319 09:20:48.071897 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dwx6\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-kube-api-access-8dwx6\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.071941 master-0 kubenswrapper[7518]: I0319 09:20:48.071926 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.072012 master-0 kubenswrapper[7518]: I0319 09:20:48.071945 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: 
\"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.072059 master-0 kubenswrapper[7518]: I0319 09:20:48.072042 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.073235 master-0 kubenswrapper[7518]: E0319 09:20:48.072124 7518 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 09:20:48.073235 master-0 kubenswrapper[7518]: E0319 09:20:48.072169 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert podName:7214416f-03b4-4507-918b-ca3c0c95773e nodeName:}" failed. No retries permitted until 2026-03-19 09:20:56.072155934 +0000 UTC m=+73.954739193 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert") pod "route-controller-manager-58559b7f6c-j4rrt" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e") : secret "serving-cert" not found Mar 19 09:20:48.073235 master-0 kubenswrapper[7518]: I0319 09:20:48.073132 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db42b38e-294e-4016-8ac1-54126ac60de8-cache\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.073430 master-0 kubenswrapper[7518]: I0319 09:20:48.073274 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.082665 master-0 kubenswrapper[7518]: I0319 09:20:48.082620 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.149679 master-0 kubenswrapper[7518]: I0319 09:20:48.149616 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 09:20:48.173796 master-0 kubenswrapper[7518]: I0319 09:20:48.173035 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.173796 master-0 kubenswrapper[7518]: I0319 09:20:48.173103 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.173796 master-0 kubenswrapper[7518]: I0319 09:20:48.173141 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1dd59466-0133-41fe-a648-28db73aa861b-cache\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.173796 master-0 kubenswrapper[7518]: I0319 09:20:48.173156 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.173796 master-0 kubenswrapper[7518]: I0319 09:20:48.173179 7518 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.173796 master-0 kubenswrapper[7518]: I0319 09:20:48.173212 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzntq\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-kube-api-access-gzntq\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.174315 master-0 kubenswrapper[7518]: I0319 09:20:48.174277 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"] Mar 19 09:20:48.177770 master-0 kubenswrapper[7518]: I0319 09:20:48.177705 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"] Mar 19 09:20:48.179630 master-0 kubenswrapper[7518]: I0319 09:20:48.179492 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"] Mar 19 09:20:48.197447 master-0 kubenswrapper[7518]: I0319 09:20:48.197383 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-66c44d7ccf-z4ssv"] Mar 19 09:20:48.197447 master-0 kubenswrapper[7518]: I0319 09:20:48.197434 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dwx6\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-kube-api-access-8dwx6\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: 
\"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.240205 master-0 kubenswrapper[7518]: I0319 09:20:48.240108 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.302865 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1dd59466-0133-41fe-a648-28db73aa861b-cache\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.302913 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.302939 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.302975 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzntq\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-kube-api-access-gzntq\") pod 
\"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.303005 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.303048 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.303123 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.303633 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1dd59466-0133-41fe-a648-28db73aa861b-cache\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: E0319 09:20:48.303731 7518 secret.go:189] Couldn't get 
secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: E0319 09:20:48.303776 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs podName:1dd59466-0133-41fe-a648-28db73aa861b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:48.803762411 +0000 UTC m=+66.686345670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-7wdws" (UID: "1dd59466-0133-41fe-a648-28db73aa861b") : secret "catalogserver-cert" not found Mar 19 09:20:48.305882 master-0 kubenswrapper[7518]: I0319 09:20:48.304072 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.312747 master-0 kubenswrapper[7518]: I0319 09:20:48.312702 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.322560 master-0 kubenswrapper[7518]: I0319 09:20:48.322196 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bc6cad-d4ba-4d22-b9c9-117c91de19a1" path="/var/lib/kubelet/pods/f6bc6cad-d4ba-4d22-b9c9-117c91de19a1/volumes" Mar 19 09:20:48.396219 master-0 kubenswrapper[7518]: I0319 09:20:48.396157 7518 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gzntq\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-kube-api-access-gzntq\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.475384 master-0 kubenswrapper[7518]: I0319 09:20:48.475280 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:20:48.495907 master-0 kubenswrapper[7518]: I0319 09:20:48.493125 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"] Mar 19 09:20:48.729190 master-0 kubenswrapper[7518]: I0319 09:20:48.729150 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerStarted","Data":"78ed7df2de04c4d9012bf3b0bae0730cc7f525024f23a27fe0e47c32e46b41f6"} Mar 19 09:20:48.730258 master-0 kubenswrapper[7518]: I0319 09:20:48.730207 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" event={"ID":"33e92e5d-61ea-45b2-b357-ebffdaebf4af","Type":"ContainerStarted","Data":"50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500"} Mar 19 09:20:48.731606 master-0 kubenswrapper[7518]: I0319 09:20:48.731549 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerStarted","Data":"3580cc8aaddf6d9ceec4e9655520a84a1d14647aea74906c068b15c17cd230e2"} Mar 19 09:20:48.733056 master-0 kubenswrapper[7518]: I0319 09:20:48.733007 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" event={"ID":"ece5177b-ae15-4c33-a8d4-612ab50b2b8b","Type":"ContainerStarted","Data":"f7e0d1fae2c29d1550044dbfbc303fe4f5bb6dc47066c479df51113017952abe"} Mar 19 09:20:48.734280 master-0 kubenswrapper[7518]: I0319 09:20:48.734256 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8edda930-b012-4f1f-977a-a71ef8763fe3","Type":"ContainerStarted","Data":"b2515e8d783e89d55f85d85bd5d14ced809801e3538f9daafbc170ce9d11b9e0"} Mar 19 09:20:48.735348 master-0 kubenswrapper[7518]: I0319 09:20:48.735326 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" event={"ID":"a417fe25-4aca-471c-941d-c195b6141042","Type":"ContainerStarted","Data":"477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991"} Mar 19 09:20:48.737214 master-0 kubenswrapper[7518]: I0319 09:20:48.737137 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" event={"ID":"9ac42112-6a00-4c17-b230-75b565aa668f","Type":"ContainerStarted","Data":"9f7751b6243f5b55d5db7507e92a7214e3b051f064f66c13d1a6b5d546c577a0"} Mar 19 09:20:48.813393 master-0 kubenswrapper[7518]: I0319 09:20:48.813343 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:48.813600 master-0 kubenswrapper[7518]: E0319 09:20:48.813559 7518 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 09:20:48.813660 master-0 kubenswrapper[7518]: E0319 09:20:48.813631 7518 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs podName:1dd59466-0133-41fe-a648-28db73aa861b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:49.813614553 +0000 UTC m=+67.696197802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-7wdws" (UID: "1dd59466-0133-41fe-a648-28db73aa861b") : secret "catalogserver-cert" not found Mar 19 09:20:48.915057 master-0 kubenswrapper[7518]: I0319 09:20:48.914727 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert\") pod \"apiserver-7558b877c5-pb68b\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:48.915057 master-0 kubenswrapper[7518]: E0319 09:20:48.915018 7518 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found Mar 19 09:20:48.915266 master-0 kubenswrapper[7518]: E0319 09:20:48.915098 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert podName:cabb0e91-c3ad-4142-8834-06fd8d55c0b7 nodeName:}" failed. No retries permitted until 2026-03-19 09:20:50.915073876 +0000 UTC m=+68.797657145 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert") pod "apiserver-7558b877c5-pb68b" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7") : secret "serving-cert" not found Mar 19 09:20:49.482579 master-0 kubenswrapper[7518]: I0319 09:20:49.481797 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"] Mar 19 09:20:49.482579 master-0 kubenswrapper[7518]: E0319 09:20:49.482514 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" podUID="cabb0e91-c3ad-4142-8834-06fd8d55c0b7" Mar 19 09:20:49.745616 master-0 kubenswrapper[7518]: I0319 09:20:49.745507 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8edda930-b012-4f1f-977a-a71ef8763fe3","Type":"ContainerStarted","Data":"bfdd507a44c29b0cf95c9bc532b3b91ef64c10d4c5165041299ded0e08cc28ac"} Mar 19 09:20:49.753204 master-0 kubenswrapper[7518]: I0319 09:20:49.753122 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:49.753411 master-0 kubenswrapper[7518]: I0319 09:20:49.753179 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerStarted","Data":"46c2569238ab51925b376aaf70b8ba157122b93aded3cd51d5f8d5c316256bd1"} Mar 19 09:20:49.753529 master-0 kubenswrapper[7518]: I0319 09:20:49.753421 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:49.753529 master-0 kubenswrapper[7518]: I0319 09:20:49.753434 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerStarted","Data":"35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50"} Mar 19 09:20:49.765425 master-0 kubenswrapper[7518]: I0319 09:20:49.765252 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:49.833371 master-0 kubenswrapper[7518]: I0319 09:20:49.833321 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:49.833584 master-0 kubenswrapper[7518]: E0319 09:20:49.833511 7518 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 09:20:49.833584 master-0 kubenswrapper[7518]: E0319 09:20:49.833575 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs podName:1dd59466-0133-41fe-a648-28db73aa861b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:51.833558246 +0000 UTC m=+69.716141505 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-7wdws" (UID: "1dd59466-0133-41fe-a648-28db73aa861b") : secret "catalogserver-cert" not found Mar 19 09:20:49.900645 master-0 kubenswrapper[7518]: I0319 09:20:49.898222 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=2.898204253 podStartE2EDuration="2.898204253s" podCreationTimestamp="2026-03-19 09:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:49.898042708 +0000 UTC m=+67.780625967" watchObservedRunningTime="2026-03-19 09:20:49.898204253 +0000 UTC m=+67.780787512" Mar 19 09:20:49.928478 master-0 kubenswrapper[7518]: I0319 09:20:49.928314 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" podStartSLOduration=2.928296941 podStartE2EDuration="2.928296941s" podCreationTimestamp="2026-03-19 09:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:20:49.927838496 +0000 UTC m=+67.810421775" watchObservedRunningTime="2026-03-19 09:20:49.928296941 +0000 UTC m=+67.810880200" Mar 19 09:20:49.935588 master-0 kubenswrapper[7518]: I0319 09:20:49.935550 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-encryption-config\") pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.935588 master-0 kubenswrapper[7518]: I0319 09:20:49.935602 7518 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-client\") pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.935819 master-0 kubenswrapper[7518]: I0319 09:20:49.935655 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-policies\") pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.935819 master-0 kubenswrapper[7518]: I0319 09:20:49.935699 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-dir\") pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.935819 master-0 kubenswrapper[7518]: I0319 09:20:49.935739 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-serving-ca\") pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.935819 master-0 kubenswrapper[7518]: I0319 09:20:49.935771 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5trwr\" (UniqueName: \"kubernetes.io/projected/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-kube-api-access-5trwr\") pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.935819 master-0 kubenswrapper[7518]: I0319 09:20:49.935794 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-trusted-ca-bundle\") 
pod \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\" (UID: \"cabb0e91-c3ad-4142-8834-06fd8d55c0b7\") " Mar 19 09:20:49.937925 master-0 kubenswrapper[7518]: I0319 09:20:49.937756 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:20:49.938412 master-0 kubenswrapper[7518]: I0319 09:20:49.938383 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:49.939489 master-0 kubenswrapper[7518]: I0319 09:20:49.939187 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:49.941235 master-0 kubenswrapper[7518]: I0319 09:20:49.940161 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:49.947630 master-0 kubenswrapper[7518]: I0319 09:20:49.947416 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:49.955827 master-0 kubenswrapper[7518]: I0319 09:20:49.955745 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:49.966048 master-0 kubenswrapper[7518]: I0319 09:20:49.964953 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-kube-api-access-5trwr" (OuterVolumeSpecName: "kube-api-access-5trwr") pod "cabb0e91-c3ad-4142-8834-06fd8d55c0b7" (UID: "cabb0e91-c3ad-4142-8834-06fd8d55c0b7"). InnerVolumeSpecName "kube-api-access-5trwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039016 7518 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039058 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039074 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5trwr\" (UniqueName: \"kubernetes.io/projected/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-kube-api-access-5trwr\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039086 7518 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039097 7518 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039110 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.040649 master-0 kubenswrapper[7518]: I0319 09:20:50.039123 7518 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:50.757048 master-0 kubenswrapper[7518]: I0319 09:20:50.756977 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7558b877c5-pb68b" Mar 19 09:20:50.815453 master-0 kubenswrapper[7518]: I0319 09:20:50.812940 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"] Mar 19 09:20:50.820184 master-0 kubenswrapper[7518]: I0319 09:20:50.820102 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"] Mar 19 09:20:50.821194 master-0 kubenswrapper[7518]: I0319 09:20:50.821161 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.829232 master-0 kubenswrapper[7518]: I0319 09:20:50.829162 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.829570 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.829916 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.830095 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.830274 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.830561 7518 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.830707 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:20:50.831335 master-0 kubenswrapper[7518]: I0319 09:20:50.830827 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 09:20:50.833578 master-0 kubenswrapper[7518]: I0319 09:20:50.833532 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-7558b877c5-pb68b"] Mar 19 09:20:50.834702 master-0 kubenswrapper[7518]: I0319 09:20:50.834657 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"] Mar 19 09:20:50.954330 master-0 kubenswrapper[7518]: I0319 09:20:50.954245 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-client\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954565 master-0 kubenswrapper[7518]: I0319 09:20:50.954348 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-serving-ca\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954565 master-0 kubenswrapper[7518]: I0319 09:20:50.954420 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-policies\") pod 
\"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954565 master-0 kubenswrapper[7518]: I0319 09:20:50.954488 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-trusted-ca-bundle\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954700 master-0 kubenswrapper[7518]: I0319 09:20:50.954567 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-dir\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954700 master-0 kubenswrapper[7518]: I0319 09:20:50.954620 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-encryption-config\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954776 master-0 kubenswrapper[7518]: I0319 09:20:50.954696 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-serving-cert\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954816 master-0 kubenswrapper[7518]: I0319 09:20:50.954772 7518 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2ntw\" (UniqueName: \"kubernetes.io/projected/b2bff8a5-c45d-4d28-8771-2239ad0fa578-kube-api-access-s2ntw\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:50.954870 master-0 kubenswrapper[7518]: I0319 09:20:50.954848 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cabb0e91-c3ad-4142-8834-06fd8d55c0b7-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:51.055887 master-0 kubenswrapper[7518]: I0319 09:20:51.055745 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-client\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.055887 master-0 kubenswrapper[7518]: I0319 09:20:51.055805 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-serving-ca\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.055887 master-0 kubenswrapper[7518]: I0319 09:20:51.055821 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-policies\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.055887 master-0 kubenswrapper[7518]: I0319 09:20:51.055841 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-trusted-ca-bundle\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.055887 master-0 kubenswrapper[7518]: I0319 09:20:51.055865 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-dir\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.055887 master-0 kubenswrapper[7518]: I0319 09:20:51.055885 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-encryption-config\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.056308 master-0 kubenswrapper[7518]: I0319 09:20:51.055917 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-serving-cert\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.056308 master-0 kubenswrapper[7518]: I0319 09:20:51.055947 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2ntw\" (UniqueName: \"kubernetes.io/projected/b2bff8a5-c45d-4d28-8771-2239ad0fa578-kube-api-access-s2ntw\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.057675 master-0 kubenswrapper[7518]: I0319 09:20:51.057159 
7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-dir\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.057675 master-0 kubenswrapper[7518]: I0319 09:20:51.057215 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-trusted-ca-bundle\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.057675 master-0 kubenswrapper[7518]: I0319 09:20:51.057579 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-policies\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.057675 master-0 kubenswrapper[7518]: I0319 09:20:51.057608 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-serving-ca\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.060160 master-0 kubenswrapper[7518]: I0319 09:20:51.060095 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-encryption-config\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.060522 master-0 
kubenswrapper[7518]: I0319 09:20:51.060453 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-client\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.062925 master-0 kubenswrapper[7518]: I0319 09:20:51.062880 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-serving-cert\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.075543 master-0 kubenswrapper[7518]: I0319 09:20:51.075479 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2ntw\" (UniqueName: \"kubernetes.io/projected/b2bff8a5-c45d-4d28-8771-2239ad0fa578-kube-api-access-s2ntw\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.171606 master-0 kubenswrapper[7518]: I0319 09:20:51.171524 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:20:51.863750 master-0 kubenswrapper[7518]: I0319 09:20:51.863696 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:51.864195 master-0 kubenswrapper[7518]: E0319 09:20:51.863932 7518 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 19 09:20:51.864195 master-0 kubenswrapper[7518]: E0319 09:20:51.864030 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs podName:1dd59466-0133-41fe-a648-28db73aa861b nodeName:}" failed. No retries permitted until 2026-03-19 09:20:55.864009401 +0000 UTC m=+73.746592660 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs") pod "catalogd-controller-manager-6864dc98f7-7wdws" (UID: "1dd59466-0133-41fe-a648-28db73aa861b") : secret "catalogserver-cert" not found Mar 19 09:20:52.322207 master-0 kubenswrapper[7518]: I0319 09:20:52.322148 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabb0e91-c3ad-4142-8834-06fd8d55c0b7" path="/var/lib/kubelet/pods/cabb0e91-c3ad-4142-8834-06fd8d55c0b7/volumes" Mar 19 09:20:53.769010 master-0 kubenswrapper[7518]: I0319 09:20:53.768705 7518 generic.go:334] "Generic (PLEG): container finished" podID="310d604b-fe9a-4b19-b8b5-7a1983e45e67" containerID="f349a28ea0bb985b97d809f46b60d5c4412444c67eeb0389e91efb0430bb6dcb" exitCode=0 Mar 19 09:20:53.769010 master-0 kubenswrapper[7518]: I0319 09:20:53.768997 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" event={"ID":"310d604b-fe9a-4b19-b8b5-7a1983e45e67","Type":"ContainerDied","Data":"f349a28ea0bb985b97d809f46b60d5c4412444c67eeb0389e91efb0430bb6dcb"} Mar 19 09:20:53.769794 master-0 kubenswrapper[7518]: I0319 09:20:53.769359 7518 scope.go:117] "RemoveContainer" containerID="f349a28ea0bb985b97d809f46b60d5c4412444c67eeb0389e91efb0430bb6dcb" Mar 19 09:20:55.918888 master-0 kubenswrapper[7518]: I0319 09:20:55.918811 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:55.923656 master-0 kubenswrapper[7518]: I0319 09:20:55.923612 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" 
(UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:56.121007 master-0 kubenswrapper[7518]: I0319 09:20:56.120867 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:20:56.125692 master-0 kubenswrapper[7518]: I0319 09:20:56.125451 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"route-controller-manager-58559b7f6c-j4rrt\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:20:56.163258 master-0 kubenswrapper[7518]: I0319 09:20:56.163201 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:20:56.401872 master-0 kubenswrapper[7518]: I0319 09:20:56.401751 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:20:57.169713 master-0 kubenswrapper[7518]: I0319 09:20:57.169103 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:20:57.170229 master-0 kubenswrapper[7518]: I0319 09:20:57.170059 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="8edda930-b012-4f1f-977a-a71ef8763fe3" containerName="installer" containerID="cri-o://bfdd507a44c29b0cf95c9bc532b3b91ef64c10d4c5165041299ded0e08cc28ac" gracePeriod=30 Mar 19 09:20:58.007191 master-0 kubenswrapper[7518]: I0319 09:20:58.000408 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65dbf9584-tg7x7"] Mar 19 09:20:58.007191 master-0 kubenswrapper[7518]: I0319 09:20:58.006397 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" podUID="cf60b652-41e7-492a-a1f1-d6b2f9680f67" containerName="controller-manager" containerID="cri-o://4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3" gracePeriod=30 Mar 19 09:20:58.041347 master-0 kubenswrapper[7518]: I0319 09:20:58.041297 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"] Mar 19 09:20:58.247597 master-0 kubenswrapper[7518]: I0319 09:20:58.246580 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:20:58.389466 master-0 kubenswrapper[7518]: I0319 09:20:58.389407 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"] Mar 19 09:20:58.392293 master-0 kubenswrapper[7518]: I0319 09:20:58.392251 7518 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"] Mar 19 09:20:58.392803 master-0 kubenswrapper[7518]: I0319 09:20:58.392565 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"] Mar 19 09:20:58.694350 master-0 kubenswrapper[7518]: I0319 09:20:58.694010 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" Mar 19 09:20:58.783261 master-0 kubenswrapper[7518]: I0319 09:20:58.782687 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-proxy-ca-bundles\") pod \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " Mar 19 09:20:58.783261 master-0 kubenswrapper[7518]: I0319 09:20:58.782740 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9ksf\" (UniqueName: \"kubernetes.io/projected/cf60b652-41e7-492a-a1f1-d6b2f9680f67-kube-api-access-g9ksf\") pod \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " Mar 19 09:20:58.783261 master-0 kubenswrapper[7518]: I0319 09:20:58.782768 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-client-ca\") pod \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " Mar 19 09:20:58.783261 master-0 kubenswrapper[7518]: I0319 09:20:58.782795 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-config\") pod \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\" (UID: 
\"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " Mar 19 09:20:58.783261 master-0 kubenswrapper[7518]: I0319 09:20:58.782820 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf60b652-41e7-492a-a1f1-d6b2f9680f67-serving-cert\") pod \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\" (UID: \"cf60b652-41e7-492a-a1f1-d6b2f9680f67\") " Mar 19 09:20:58.783721 master-0 kubenswrapper[7518]: I0319 09:20:58.783511 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf60b652-41e7-492a-a1f1-d6b2f9680f67" (UID: "cf60b652-41e7-492a-a1f1-d6b2f9680f67"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:58.783721 master-0 kubenswrapper[7518]: I0319 09:20:58.783524 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cf60b652-41e7-492a-a1f1-d6b2f9680f67" (UID: "cf60b652-41e7-492a-a1f1-d6b2f9680f67"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:58.783721 master-0 kubenswrapper[7518]: I0319 09:20:58.783545 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-config" (OuterVolumeSpecName: "config") pod "cf60b652-41e7-492a-a1f1-d6b2f9680f67" (UID: "cf60b652-41e7-492a-a1f1-d6b2f9680f67"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:20:58.790677 master-0 kubenswrapper[7518]: I0319 09:20:58.790302 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf60b652-41e7-492a-a1f1-d6b2f9680f67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf60b652-41e7-492a-a1f1-d6b2f9680f67" (UID: "cf60b652-41e7-492a-a1f1-d6b2f9680f67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:20:58.790823 master-0 kubenswrapper[7518]: I0319 09:20:58.790706 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf60b652-41e7-492a-a1f1-d6b2f9680f67-kube-api-access-g9ksf" (OuterVolumeSpecName: "kube-api-access-g9ksf") pod "cf60b652-41e7-492a-a1f1-d6b2f9680f67" (UID: "cf60b652-41e7-492a-a1f1-d6b2f9680f67"). InnerVolumeSpecName "kube-api-access-g9ksf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:20:58.796363 master-0 kubenswrapper[7518]: I0319 09:20:58.796310 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" event={"ID":"37533d4d-1eed-4f61-853e-4536958bf13a","Type":"ContainerStarted","Data":"105a6a40133824d2e007836aefb615c5d194273bd0bbd901bcc6063dee6bd3fb"} Mar 19 09:20:58.804256 master-0 kubenswrapper[7518]: I0319 09:20:58.803027 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" event={"ID":"a417fe25-4aca-471c-941d-c195b6141042","Type":"ContainerStarted","Data":"c9962dbaaef7e57c9640c3178a747f94e808e39adbc57a0ff6abea54d49966c4"} Mar 19 09:20:58.808037 master-0 kubenswrapper[7518]: I0319 09:20:58.807954 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" 
event={"ID":"33e92e5d-61ea-45b2-b357-ebffdaebf4af","Type":"ContainerStarted","Data":"e567b2a6970dbbdd6d360830a8ee46fec46945b28639df21bdc4828de4e3065b"} Mar 19 09:20:58.808967 master-0 kubenswrapper[7518]: I0319 09:20:58.808940 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:20:58.812568 master-0 kubenswrapper[7518]: I0319 09:20:58.812500 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:20:58.816231 master-0 kubenswrapper[7518]: I0319 09:20:58.815843 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerStarted","Data":"d43b2cecb46ee4d7282d2377662b9eb7bab83399567784e4db2c8496f2616648"} Mar 19 09:20:58.819441 master-0 kubenswrapper[7518]: I0319 09:20:58.819326 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" event={"ID":"ece5177b-ae15-4c33-a8d4-612ab50b2b8b","Type":"ContainerStarted","Data":"6fb399b5a6bd7bca5adfd7320a0c00aaa9e16fc36388387d71d3d22de37fda4a"} Mar 19 09:20:58.823354 master-0 kubenswrapper[7518]: I0319 09:20:58.823289 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" event={"ID":"1dd59466-0133-41fe-a648-28db73aa861b","Type":"ContainerStarted","Data":"88dd8210417d34cd695549010f86bdfe2541add1af48e0e0b07c7ed8f524f103"} Mar 19 09:20:58.824544 master-0 kubenswrapper[7518]: I0319 09:20:58.824404 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" event={"ID":"b2bff8a5-c45d-4d28-8771-2239ad0fa578","Type":"ContainerStarted","Data":"15eda3bde3926ace98dc82fe5b6fb4d1ace5d01b315a5e6ece92e5b50ae9132e"} Mar 19 09:20:58.825715 master-0 
kubenswrapper[7518]: I0319 09:20:58.825689 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" event={"ID":"7214416f-03b4-4507-918b-ca3c0c95773e","Type":"ContainerStarted","Data":"4a22ef1476d8d21d51a15c1fd47011040d8afa763f696b2a13ec917bbfbd6be8"} Mar 19 09:20:58.829219 master-0 kubenswrapper[7518]: I0319 09:20:58.828874 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" event={"ID":"310d604b-fe9a-4b19-b8b5-7a1983e45e67","Type":"ContainerStarted","Data":"cff56a454aada979d268c9f501837771f3a09461de2ce1b57ec395bc1b538aae"} Mar 19 09:20:58.831013 master-0 kubenswrapper[7518]: I0319 09:20:58.830969 7518 generic.go:334] "Generic (PLEG): container finished" podID="cf60b652-41e7-492a-a1f1-d6b2f9680f67" containerID="4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3" exitCode=0 Mar 19 09:20:58.831071 master-0 kubenswrapper[7518]: I0319 09:20:58.831026 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" Mar 19 09:20:58.831071 master-0 kubenswrapper[7518]: I0319 09:20:58.831058 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" event={"ID":"cf60b652-41e7-492a-a1f1-d6b2f9680f67","Type":"ContainerDied","Data":"4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3"} Mar 19 09:20:58.831172 master-0 kubenswrapper[7518]: I0319 09:20:58.831093 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65dbf9584-tg7x7" event={"ID":"cf60b652-41e7-492a-a1f1-d6b2f9680f67","Type":"ContainerDied","Data":"88434a59a7308bc36e38b535d7d9d2585acc58eac032cef32588d420be3ca90a"} Mar 19 09:20:58.831211 master-0 kubenswrapper[7518]: I0319 09:20:58.831169 7518 scope.go:117] "RemoveContainer" containerID="4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3" Mar 19 09:20:58.833218 master-0 kubenswrapper[7518]: I0319 09:20:58.833174 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" event={"ID":"32b1ae47-ef83-448d-b40d-a836cb6c6fc0","Type":"ContainerStarted","Data":"c7e68cb3271256a9333d55ffab578a3758ec1fbf9021fe986d32592d8ec62834"} Mar 19 09:20:58.856938 master-0 kubenswrapper[7518]: I0319 09:20:58.856877 7518 scope.go:117] "RemoveContainer" containerID="4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3" Mar 19 09:20:58.857589 master-0 kubenswrapper[7518]: E0319 09:20:58.857541 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3\": container with ID starting with 4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3 not found: ID does not exist" 
containerID="4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3" Mar 19 09:20:58.857771 master-0 kubenswrapper[7518]: I0319 09:20:58.857604 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3"} err="failed to get container status \"4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3\": rpc error: code = NotFound desc = could not find container \"4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3\": container with ID starting with 4d985685b8f116eb3ebee42e27084d89c4dfac93e0d465722a52a581e84ba0d3 not found: ID does not exist" Mar 19 09:20:58.886076 master-0 kubenswrapper[7518]: I0319 09:20:58.886033 7518 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:58.886267 master-0 kubenswrapper[7518]: I0319 09:20:58.886098 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9ksf\" (UniqueName: \"kubernetes.io/projected/cf60b652-41e7-492a-a1f1-d6b2f9680f67-kube-api-access-g9ksf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:58.886267 master-0 kubenswrapper[7518]: I0319 09:20:58.886110 7518 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:58.886267 master-0 kubenswrapper[7518]: I0319 09:20:58.886119 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60b652-41e7-492a-a1f1-d6b2f9680f67-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:58.886267 master-0 kubenswrapper[7518]: I0319 09:20:58.886129 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf60b652-41e7-492a-a1f1-d6b2f9680f67-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:20:59.837905 master-0 kubenswrapper[7518]: I0319 09:20:59.837843 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" event={"ID":"1dd59466-0133-41fe-a648-28db73aa861b","Type":"ContainerStarted","Data":"d9a207a034373d18840850084ca3af4f71d5a110c1cc9d64cde96ab1b2db955e"} Mar 19 09:20:59.839468 master-0 kubenswrapper[7518]: I0319 09:20:59.839122 7518 generic.go:334] "Generic (PLEG): container finished" podID="37533d4d-1eed-4f61-853e-4536958bf13a" containerID="105a6a40133824d2e007836aefb615c5d194273bd0bbd901bcc6063dee6bd3fb" exitCode=0 Mar 19 09:20:59.839540 master-0 kubenswrapper[7518]: I0319 09:20:59.839169 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" event={"ID":"37533d4d-1eed-4f61-853e-4536958bf13a","Type":"ContainerDied","Data":"105a6a40133824d2e007836aefb615c5d194273bd0bbd901bcc6063dee6bd3fb"} Mar 19 09:20:59.840718 master-0 kubenswrapper[7518]: I0319 09:20:59.840687 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" event={"ID":"9ac42112-6a00-4c17-b230-75b565aa668f","Type":"ContainerStarted","Data":"411b807ef5e4b2a4c4cb5b0d7895136bfe753e0f79cfceb1fc1e64b7b6417d07"} Mar 19 09:20:59.843052 master-0 kubenswrapper[7518]: I0319 09:20:59.843023 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerStarted","Data":"899cf44d089ff0f2dec617eb293dbf32b1dc1c098dd4a10862dea763e6544b07"} Mar 19 09:20:59.845101 master-0 kubenswrapper[7518]: I0319 09:20:59.845069 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" 
event={"ID":"ece5177b-ae15-4c33-a8d4-612ab50b2b8b","Type":"ContainerStarted","Data":"6645214728a162ebee91aa7420fa188f7c83660da77255a0937c499b5d812049"} Mar 19 09:21:00.049639 master-0 kubenswrapper[7518]: I0319 09:21:00.046965 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: I0319 09:21:00.091957 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fcf878b4-mjm86"] Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: E0319 09:21:00.092189 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37533d4d-1eed-4f61-853e-4536958bf13a" containerName="fix-audit-permissions" Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: I0319 09:21:00.092202 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="37533d4d-1eed-4f61-853e-4536958bf13a" containerName="fix-audit-permissions" Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: E0319 09:21:00.092220 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf60b652-41e7-492a-a1f1-d6b2f9680f67" containerName="controller-manager" Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: I0319 09:21:00.092228 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf60b652-41e7-492a-a1f1-d6b2f9680f67" containerName="controller-manager" Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: I0319 09:21:00.092300 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf60b652-41e7-492a-a1f1-d6b2f9680f67" containerName="controller-manager" Mar 19 09:21:00.092573 master-0 kubenswrapper[7518]: I0319 09:21:00.092325 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="37533d4d-1eed-4f61-853e-4536958bf13a" containerName="fix-audit-permissions" Mar 19 09:21:00.093506 master-0 kubenswrapper[7518]: I0319 09:21:00.093101 7518 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:00.093506 master-0 kubenswrapper[7518]: I0319 09:21:00.093458 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.093988 master-0 kubenswrapper[7518]: I0319 09:21:00.093808 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.103311 master-0 kubenswrapper[7518]: I0319 09:21:00.101689 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:21:00.103311 master-0 kubenswrapper[7518]: I0319 09:21:00.101983 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 09:21:00.103311 master-0 kubenswrapper[7518]: I0319 09:21:00.102177 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:21:00.103311 master-0 kubenswrapper[7518]: I0319 09:21:00.102313 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:21:00.103311 master-0 kubenswrapper[7518]: I0319 09:21:00.102439 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:21:00.111938 master-0 kubenswrapper[7518]: I0319 09:21:00.111890 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:21:00.112136 master-0 kubenswrapper[7518]: I0319 09:21:00.112083 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:00.113167 master-0 kubenswrapper[7518]: I0319 09:21:00.113122 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-7fcf878b4-mjm86"] Mar 19 09:21:00.206877 master-0 kubenswrapper[7518]: I0319 09:21:00.206824 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-audit-dir\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.206892 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-config\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.206917 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-serving-ca\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.206951 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-audit\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.206973 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-client\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.207003 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-trusted-ca-bundle\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.207033 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-image-import-ca\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.207057 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-serving-cert\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.207085 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-node-pullsecrets\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207134 master-0 kubenswrapper[7518]: I0319 09:21:00.207110 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-encryption-config\") pod \"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207160 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnbxp\" (UniqueName: \"kubernetes.io/projected/37533d4d-1eed-4f61-853e-4536958bf13a-kube-api-access-bnbxp\") pod 
\"37533d4d-1eed-4f61-853e-4536958bf13a\" (UID: \"37533d4d-1eed-4f61-853e-4536958bf13a\") " Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207273 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-var-lock\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207307 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de27a71b-4736-46c2-8de7-d409fa52685d-serving-cert\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207335 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207366 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-client-ca\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207489 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-proxy-ca-bundles\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207513 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e70442bc-7032-4700-9b0b-9f71acce25ad-kube-api-access\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207554 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltpk\" (UniqueName: \"kubernetes.io/projected/de27a71b-4736-46c2-8de7-d409fa52685d-kube-api-access-jltpk\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.207636 master-0 kubenswrapper[7518]: I0319 09:21:00.207635 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-config\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.207957 master-0 kubenswrapper[7518]: I0319 09:21:00.207709 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:00.207957 master-0 kubenswrapper[7518]: I0319 09:21:00.207806 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:00.208405 master-0 kubenswrapper[7518]: I0319 09:21:00.208326 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-config" (OuterVolumeSpecName: "config") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:00.208685 master-0 kubenswrapper[7518]: I0319 09:21:00.208664 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:00.209412 master-0 kubenswrapper[7518]: I0319 09:21:00.209364 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:00.209663 master-0 kubenswrapper[7518]: I0319 09:21:00.209623 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:00.211287 master-0 kubenswrapper[7518]: I0319 09:21:00.211012 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-audit" (OuterVolumeSpecName: "audit") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:00.216697 master-0 kubenswrapper[7518]: I0319 09:21:00.216609 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37533d4d-1eed-4f61-853e-4536958bf13a-kube-api-access-bnbxp" (OuterVolumeSpecName: "kube-api-access-bnbxp") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "kube-api-access-bnbxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:00.216697 master-0 kubenswrapper[7518]: I0319 09:21:00.216667 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:21:00.216954 master-0 kubenswrapper[7518]: I0319 09:21:00.216640 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:21:00.216954 master-0 kubenswrapper[7518]: I0319 09:21:00.216757 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "37533d4d-1eed-4f61-853e-4536958bf13a" (UID: "37533d4d-1eed-4f61-853e-4536958bf13a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:21:00.308632 master-0 kubenswrapper[7518]: I0319 09:21:00.308565 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jltpk\" (UniqueName: \"kubernetes.io/projected/de27a71b-4736-46c2-8de7-d409fa52685d-kube-api-access-jltpk\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.308963 master-0 kubenswrapper[7518]: I0319 09:21:00.308897 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-config\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.309209 master-0 kubenswrapper[7518]: I0319 09:21:00.309160 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-var-lock\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.310595 master-0 kubenswrapper[7518]: I0319 09:21:00.310556 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-var-lock\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.310643 master-0 kubenswrapper[7518]: I0319 09:21:00.310626 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de27a71b-4736-46c2-8de7-d409fa52685d-serving-cert\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.310673 master-0 kubenswrapper[7518]: I0319 09:21:00.310653 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.310716 master-0 kubenswrapper[7518]: I0319 09:21:00.310690 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-client-ca\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.310752 master-0 kubenswrapper[7518]: I0319 09:21:00.310738 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-proxy-ca-bundles\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.310783 master-0 kubenswrapper[7518]: I0319 09:21:00.310767 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e70442bc-7032-4700-9b0b-9f71acce25ad-kube-api-access\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.310889 master-0 kubenswrapper[7518]: I0319 09:21:00.310861 7518 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310889 master-0 kubenswrapper[7518]: I0319 09:21:00.310887 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnbxp\" (UniqueName: \"kubernetes.io/projected/37533d4d-1eed-4f61-853e-4536958bf13a-kube-api-access-bnbxp\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310898 7518 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310909 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310918 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310927 7518 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-audit\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310935 7518 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310945 7518 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.310951 master-0 kubenswrapper[7518]: I0319 09:21:00.310955 7518 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37533d4d-1eed-4f61-853e-4536958bf13a-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.311156 master-0 kubenswrapper[7518]: I0319 09:21:00.310965 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37533d4d-1eed-4f61-853e-4536958bf13a-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.311156 master-0 kubenswrapper[7518]: I0319 09:21:00.310974 7518 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37533d4d-1eed-4f61-853e-4536958bf13a-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:00.311156 master-0 kubenswrapper[7518]: I0319 09:21:00.311052 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-config\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.311156 master-0 kubenswrapper[7518]: I0319 09:21:00.311130 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.311976 master-0 kubenswrapper[7518]: I0319 09:21:00.311939 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-client-ca\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.313074 master-0 kubenswrapper[7518]: I0319 09:21:00.313037 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-proxy-ca-bundles\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.315841 master-0 kubenswrapper[7518]: I0319 09:21:00.315794 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de27a71b-4736-46c2-8de7-d409fa52685d-serving-cert\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.427689 master-0 kubenswrapper[7518]: I0319 09:21:00.427462 7518 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jltpk\" (UniqueName: \"kubernetes.io/projected/de27a71b-4736-46c2-8de7-d409fa52685d-kube-api-access-jltpk\") pod \"controller-manager-7fcf878b4-mjm86\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.442501 master-0 kubenswrapper[7518]: I0319 09:21:00.440259 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e70442bc-7032-4700-9b0b-9f71acce25ad-kube-api-access\") pod \"installer-2-master-0\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.450397 master-0 kubenswrapper[7518]: I0319 09:21:00.450348 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 09:21:00.473591 master-0 kubenswrapper[7518]: I0319 09:21:00.471705 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:00.872820 master-0 kubenswrapper[7518]: I0319 09:21:00.872613 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" event={"ID":"1dd59466-0133-41fe-a648-28db73aa861b","Type":"ContainerStarted","Data":"a1f85bd022ed8d1a8a116afb5f7497547553a16a5ec3238e8ae6d26b7095a795"} Mar 19 09:21:00.872820 master-0 kubenswrapper[7518]: I0319 09:21:00.872710 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:21:00.876039 master-0 kubenswrapper[7518]: I0319 09:21:00.875427 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" Mar 19 09:21:00.876511 master-0 kubenswrapper[7518]: I0319 09:21:00.875652 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66c44d7ccf-z4ssv" event={"ID":"37533d4d-1eed-4f61-853e-4536958bf13a","Type":"ContainerDied","Data":"7cff4d9dd2b2d25ee0021fa3a61334c6c54bbb2fd78b56037cdd3dc4bff919b6"} Mar 19 09:21:00.876575 master-0 kubenswrapper[7518]: I0319 09:21:00.876533 7518 scope.go:117] "RemoveContainer" containerID="105a6a40133824d2e007836aefb615c5d194273bd0bbd901bcc6063dee6bd3fb" Mar 19 09:21:00.922016 master-0 kubenswrapper[7518]: I0319 09:21:00.921828 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65dbf9584-tg7x7"] Mar 19 09:21:01.039623 master-0 kubenswrapper[7518]: I0319 09:21:01.039548 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 09:21:01.039809 master-0 kubenswrapper[7518]: I0319 09:21:01.039647 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fcf878b4-mjm86"] Mar 19 09:21:01.042659 master-0 kubenswrapper[7518]: I0319 09:21:01.040434 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65dbf9584-tg7x7"] Mar 19 09:21:01.049655 master-0 kubenswrapper[7518]: W0319 09:21:01.048435 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde27a71b_4736_46c2_8de7_d409fa52685d.slice/crio-d1a947a6e4cfe0689bee11c725d7351d0259fe7b72181ce7fe0ec6b785ca7c59 WatchSource:0}: Error finding container d1a947a6e4cfe0689bee11c725d7351d0259fe7b72181ce7fe0ec6b785ca7c59: Status 404 returned error can't find the container with id d1a947a6e4cfe0689bee11c725d7351d0259fe7b72181ce7fe0ec6b785ca7c59 Mar 19 09:21:01.063845 master-0 kubenswrapper[7518]: W0319 
09:21:01.056284 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode70442bc_7032_4700_9b0b_9f71acce25ad.slice/crio-f8ac3b71437154c471149874ab3e7f3a947283e3919139d78279c41d7256c32d WatchSource:0}: Error finding container f8ac3b71437154c471149874ab3e7f3a947283e3919139d78279c41d7256c32d: Status 404 returned error can't find the container with id f8ac3b71437154c471149874ab3e7f3a947283e3919139d78279c41d7256c32d Mar 19 09:21:01.132315 master-0 kubenswrapper[7518]: I0319 09:21:01.131670 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-vkw4s"] Mar 19 09:21:01.143508 master-0 kubenswrapper[7518]: I0319 09:21:01.135030 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235526 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysconfig\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235605 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-var-lib-kubelet\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235645 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-kubernetes\") pod \"tuned-vkw4s\" (UID: 
\"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235717 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfsx\" (UniqueName: \"kubernetes.io/projected/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-kube-api-access-rnfsx\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235751 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-modprobe-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235782 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235834 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-run\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235861 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-tmp\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235902 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-tuned\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235954 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-conf\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.235984 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-systemd\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.236013 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-lib-modules\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.236040 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-host\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.237649 master-0 kubenswrapper[7518]: I0319 09:21:01.236062 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-sys\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337012 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-modprobe-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337066 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfsx\" (UniqueName: \"kubernetes.io/projected/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-kube-api-access-rnfsx\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337097 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337143 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\"
(UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-run\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337163 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-tmp\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337207 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-tuned\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337242 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-conf\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337268 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-systemd\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337294 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-lib-modules\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337317 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-sys\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337338 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-host\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337367 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysconfig\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337388 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-var-lib-kubelet\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337416 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-kubernetes\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337626 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-kubernetes\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.337722 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-modprobe-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.338889 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339603 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-systemd\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339654 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-host\") pod
\"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339691 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-sys\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339684 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-conf\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339699 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-lib-modules\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339745 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysconfig\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.339792 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-var-lib-kubelet\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.340767 master-0 kubenswrapper[7518]: I0319 09:21:01.340600 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-run\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.344925 master-0 kubenswrapper[7518]: I0319 09:21:01.344890 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-tmp\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.346404 master-0 kubenswrapper[7518]: I0319 09:21:01.346272 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-tuned\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.377991 master-0 kubenswrapper[7518]: I0319 09:21:01.377912 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfsx\" (UniqueName: \"kubernetes.io/projected/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-kube-api-access-rnfsx\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.396255 master-0 kubenswrapper[7518]: I0319 09:21:01.396106 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-66c44d7ccf-z4ssv"]
Mar 19 09:21:01.478613 master-0 kubenswrapper[7518]: I0319 09:21:01.475105 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-66c44d7ccf-z4ssv"]
Mar 19 09:21:01.567184 master-0 kubenswrapper[7518]: I0319 09:21:01.562741 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:21:01.898141 master-0 kubenswrapper[7518]: I0319 09:21:01.897926 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p88qq"]
Mar 19 09:21:01.898950 master-0 kubenswrapper[7518]: I0319 09:21:01.898770 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:01.906375 master-0 kubenswrapper[7518]: I0319 09:21:01.905359 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" event={"ID":"de27a71b-4736-46c2-8de7-d409fa52685d","Type":"ContainerStarted","Data":"61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e"}
Mar 19 09:21:01.906375 master-0 kubenswrapper[7518]: I0319 09:21:01.905413 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" event={"ID":"de27a71b-4736-46c2-8de7-d409fa52685d","Type":"ContainerStarted","Data":"d1a947a6e4cfe0689bee11c725d7351d0259fe7b72181ce7fe0ec6b785ca7c59"}
Mar 19 09:21:01.906375 master-0 kubenswrapper[7518]: I0319 09:21:01.906339 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86"
Mar 19 09:21:01.912054 master-0 kubenswrapper[7518]: W0319 09:21:01.912011 7518 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'master-0' and this object
Mar 19 09:21:01.912294 master-0 kubenswrapper[7518]: E0319 09:21:01.912267 7518 reflector.go:158] "Unhandled Error"
err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Mar 19 09:21:01.912418 master-0 kubenswrapper[7518]: I0319 09:21:01.912018 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 09:21:01.912720 master-0 kubenswrapper[7518]: W0319 09:21:01.912102 7518 reflector.go:561] object-"openshift-dns"/"dns-default": failed to list *v1.ConfigMap: configmaps "dns-default" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'master-0' and this object
Mar 19 09:21:01.912825 master-0 kubenswrapper[7518]: E0319 09:21:01.912807 7518 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"dns-default\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"dns-default\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Mar 19 09:21:01.915617 master-0 kubenswrapper[7518]: I0319 09:21:01.915580 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"e70442bc-7032-4700-9b0b-9f71acce25ad","Type":"ContainerStarted","Data":"cedcc447f352ca925d6be4191ca4c2529ff3315af9ffcd9eb08176f813b434f8"}
Mar 19 09:21:01.915810 master-0 kubenswrapper[7518]: I0319 09:21:01.915796 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"e70442bc-7032-4700-9b0b-9f71acce25ad","Type":"ContainerStarted","Data":"f8ac3b71437154c471149874ab3e7f3a947283e3919139d78279c41d7256c32d"}
Mar 19 09:21:01.921203 master-0 kubenswrapper[7518]: I0319 09:21:01.921165 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 09:21:01.927938 master-0 kubenswrapper[7518]: I0319 09:21:01.927793 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86"
Mar 19 09:21:01.969585 master-0 kubenswrapper[7518]: I0319 09:21:01.969527 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p88qq"]
Mar 19 09:21:01.977088 master-0 kubenswrapper[7518]: I0319 09:21:01.977017 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" podStartSLOduration=14.97699165 podStartE2EDuration="14.97699165s" podCreationTimestamp="2026-03-19 09:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:01.96901539 +0000 UTC m=+79.851598659" watchObservedRunningTime="2026-03-19 09:21:01.97699165 +0000 UTC m=+79.859574919"
Mar 19 09:21:02.055315 master-0 kubenswrapper[7518]: I0319 09:21:02.055235 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmdlx\" (UniqueName: \"kubernetes.io/projected/b8f39c16-3a94-45c3-a51c-f2e81eff967d-kube-api-access-qmdlx\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.055582 master-0 kubenswrapper[7518]: I0319 09:21:02.055413 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.055582 master-0 kubenswrapper[7518]: I0319 09:21:02.055445 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8f39c16-3a94-45c3-a51c-f2e81eff967d-config-volume\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.156281 master-0 kubenswrapper[7518]: I0319 09:21:02.156143 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.156281 master-0 kubenswrapper[7518]: I0319 09:21:02.156211 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8f39c16-3a94-45c3-a51c-f2e81eff967d-config-volume\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.156281 master-0 kubenswrapper[7518]: I0319 09:21:02.156232 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdlx\" (UniqueName: \"kubernetes.io/projected/b8f39c16-3a94-45c3-a51c-f2e81eff967d-kube-api-access-qmdlx\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.156651 master-0 kubenswrapper[7518]: E0319 09:21:02.156540 7518 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 19 09:21:02.156651 master-0 kubenswrapper[7518]: E0319
09:21:02.156577 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls podName:b8f39c16-3a94-45c3-a51c-f2e81eff967d nodeName:}" failed. No retries permitted until 2026-03-19 09:21:02.656565196 +0000 UTC m=+80.539148455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls") pod "dns-default-p88qq" (UID: "b8f39c16-3a94-45c3-a51c-f2e81eff967d") : secret "dns-default-metrics-tls" not found
Mar 19 09:21:02.306241 master-0 kubenswrapper[7518]: I0319 09:21:02.305411 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" podStartSLOduration=4.305391782 podStartE2EDuration="4.305391782s" podCreationTimestamp="2026-03-19 09:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:02.304868224 +0000 UTC m=+80.187451483" watchObservedRunningTime="2026-03-19 09:21:02.305391782 +0000 UTC m=+80.187975041"
Mar 19 09:21:02.327556 master-0 kubenswrapper[7518]: I0319 09:21:02.327389 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37533d4d-1eed-4f61-853e-4536958bf13a" path="/var/lib/kubelet/pods/37533d4d-1eed-4f61-853e-4536958bf13a/volumes"
Mar 19 09:21:02.328204 master-0 kubenswrapper[7518]: I0319 09:21:02.328171 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf60b652-41e7-492a-a1f1-d6b2f9680f67" path="/var/lib/kubelet/pods/cf60b652-41e7-492a-a1f1-d6b2f9680f67/volumes"
Mar 19 09:21:02.381429 master-0 kubenswrapper[7518]: I0319 09:21:02.381362 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=3.381336411 podStartE2EDuration="3.381336411s" podCreationTimestamp="2026-03-19 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:02.381160865 +0000 UTC m=+80.263744134" watchObservedRunningTime="2026-03-19 09:21:02.381336411 +0000 UTC m=+80.263919690"
Mar 19 09:21:02.666273 master-0 kubenswrapper[7518]: I0319 09:21:02.666137 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.677016 master-0 kubenswrapper[7518]: I0319 09:21:02.676952 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:02.908322 master-0 kubenswrapper[7518]: I0319 09:21:02.908259 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 09:21:02.915591 master-0 kubenswrapper[7518]: I0319 09:21:02.915529 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdlx\" (UniqueName: \"kubernetes.io/projected/b8f39c16-3a94-45c3-a51c-f2e81eff967d-kube-api-access-qmdlx\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:03.000893 master-0 kubenswrapper[7518]: I0319 09:21:03.000789 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 09:21:03.007565 master-0 kubenswrapper[7518]: I0319 09:21:03.007499 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8f39c16-3a94-45c3-a51c-f2e81eff967d-config-volume\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:03.137322 master-0 kubenswrapper[7518]: I0319 09:21:03.137243 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:03.216376 master-0 kubenswrapper[7518]: I0319 09:21:03.216327 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-54cd8888b9-q4ztg"]
Mar 19 09:21:03.217352 master-0 kubenswrapper[7518]: I0319 09:21:03.217334 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.221131 master-0 kubenswrapper[7518]: I0319 09:21:03.221085 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 09:21:03.221323 master-0 kubenswrapper[7518]: I0319 09:21:03.221301 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:21:03.221512 master-0 kubenswrapper[7518]: I0319 09:21:03.221495 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 09:21:03.222431 master-0 kubenswrapper[7518]: I0319 09:21:03.221750 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 09:21:03.225943 master-0 kubenswrapper[7518]: I0319 09:21:03.225438 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:21:03.225943 master-0 kubenswrapper[7518]: I0319 09:21:03.225923 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 09:21:03.226882 master-0 kubenswrapper[7518]: I0319 09:21:03.226846 7518 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 09:21:03.227062 master-0 kubenswrapper[7518]: I0319 09:21:03.227036 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 09:21:03.227233 master-0 kubenswrapper[7518]: I0319 09:21:03.227210 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:21:03.238821 master-0 kubenswrapper[7518]: I0319 09:21:03.238590 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 09:21:03.288747 master-0 kubenswrapper[7518]: I0319 09:21:03.288534 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-54cd8888b9-q4ztg"]
Mar 19 09:21:03.372963 master-0 kubenswrapper[7518]: I0319 09:21:03.372864 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-client\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.372963 master-0 kubenswrapper[7518]: I0319 09:21:03.372967 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-encryption-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.373349 master-0 kubenswrapper[7518]: I0319 09:21:03.373033 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-serving-cert\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.374115 master-0 kubenswrapper[7518]: I0319 09:21:03.374081 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.374186 master-0 kubenswrapper[7518]: I0319 09:21:03.374116 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-node-pullsecrets\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.374186 master-0 kubenswrapper[7518]: I0319 09:21:03.374145 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr9dj\" (UniqueName: \"kubernetes.io/projected/3a4fd337-c385-4f56-965c-d68ee0a4e848-kube-api-access-vr9dj\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.374605 master-0 kubenswrapper[7518]: I0319 09:21:03.374436 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit-dir\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.374827 master-0 kubenswrapper[7518]: I0319 09:21:03.374786 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-trusted-ca-bundle\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.374949 master-0 kubenswrapper[7518]: I0319 09:21:03.374924 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-serving-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.375086 master-0 kubenswrapper[7518]: I0319 09:21:03.375068 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-image-import-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.375169 master-0 kubenswrapper[7518]: I0319 09:21:03.375155 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.475951 master-0 kubenswrapper[7518]: I0319 09:21:03.475899 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit-dir\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.477091 master-0 kubenswrapper[7518]: I0319 09:21:03.476066 7518
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit-dir\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.477593 master-0 kubenswrapper[7518]: I0319 09:21:03.477198 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-trusted-ca-bundle\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.477984 master-0 kubenswrapper[7518]: I0319 09:21:03.477932 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-serving-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.478178 master-0 kubenswrapper[7518]: I0319 09:21:03.478163 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-image-import-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.478580 master-0 kubenswrapper[7518]: I0319 09:21:03.478562 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.478778 master-0 kubenswrapper[7518]: I0319 09:21:03.478761 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-client\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.478865 master-0 kubenswrapper[7518]: I0319 09:21:03.478852 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-encryption-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.478975 master-0 kubenswrapper[7518]: I0319 09:21:03.478954 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-serving-cert\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479066 master-0 kubenswrapper[7518]: I0319 09:21:03.479053 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479320 master-0 kubenswrapper[7518]: I0319 09:21:03.479282 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-image-import-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479397 master-0 kubenswrapper[7518]: I0319 09:21:03.478813 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-serving-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479481 master-0 kubenswrapper[7518]: I0319 09:21:03.479279 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-node-pullsecrets\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479572 master-0 kubenswrapper[7518]: I0319 09:21:03.479518 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479633 master-0 kubenswrapper[7518]: I0319 09:21:03.478277 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-trusted-ca-bundle\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.479736 master-0 kubenswrapper[7518]: I0319 09:21:03.479720 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9dj\" (UniqueName: \"kubernetes.io/projected/3a4fd337-c385-4f56-965c-d68ee0a4e848-kube-api-access-vr9dj\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:03.480037 master-0
kubenswrapper[7518]: I0319 09:21:03.479939 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.480037 master-0 kubenswrapper[7518]: I0319 09:21:03.480002 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-node-pullsecrets\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.482628 master-0 kubenswrapper[7518]: I0319 09:21:03.482357 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-client\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.484651 master-0 kubenswrapper[7518]: I0319 09:21:03.484609 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-serving-cert\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.485910 master-0 kubenswrapper[7518]: I0319 09:21:03.485892 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-encryption-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.520218 master-0 kubenswrapper[7518]: I0319 
09:21:03.520129 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9dj\" (UniqueName: \"kubernetes.io/projected/3a4fd337-c385-4f56-965c-d68ee0a4e848-kube-api-access-vr9dj\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.542191 master-0 kubenswrapper[7518]: I0319 09:21:03.542020 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:21:03.953758 master-0 kubenswrapper[7518]: I0319 09:21:03.953673 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pmxm8"] Mar 19 09:21:03.955004 master-0 kubenswrapper[7518]: I0319 09:21:03.954418 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:03.987915 master-0 kubenswrapper[7518]: I0319 09:21:03.987854 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52fa1ad-0071-4506-bb94-e73d6f15a75c-hosts-file\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:03.988189 master-0 kubenswrapper[7518]: I0319 09:21:03.987962 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvg4q\" (UniqueName: \"kubernetes.io/projected/d52fa1ad-0071-4506-bb94-e73d6f15a75c-kube-api-access-xvg4q\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:04.088882 master-0 kubenswrapper[7518]: I0319 09:21:04.088711 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52fa1ad-0071-4506-bb94-e73d6f15a75c-hosts-file\") pod 
\"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:04.088882 master-0 kubenswrapper[7518]: I0319 09:21:04.088811 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvg4q\" (UniqueName: \"kubernetes.io/projected/d52fa1ad-0071-4506-bb94-e73d6f15a75c-kube-api-access-xvg4q\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:04.088882 master-0 kubenswrapper[7518]: I0319 09:21:04.088856 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52fa1ad-0071-4506-bb94-e73d6f15a75c-hosts-file\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:04.111205 master-0 kubenswrapper[7518]: I0319 09:21:04.110187 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvg4q\" (UniqueName: \"kubernetes.io/projected/d52fa1ad-0071-4506-bb94-e73d6f15a75c-kube-api-access-xvg4q\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:04.123517 master-0 kubenswrapper[7518]: I0319 09:21:04.123086 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 09:21:04.129095 master-0 kubenswrapper[7518]: I0319 09:21:04.128853 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.131370 master-0 kubenswrapper[7518]: I0319 09:21:04.129841 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 09:21:04.131370 master-0 kubenswrapper[7518]: I0319 09:21:04.130643 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 09:21:04.271779 master-0 kubenswrapper[7518]: I0319 09:21:04.270013 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-54cd8888b9-q4ztg"] Mar 19 09:21:04.274826 master-0 kubenswrapper[7518]: I0319 09:21:04.274126 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:21:04.300544 master-0 kubenswrapper[7518]: I0319 09:21:04.299728 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.300544 master-0 kubenswrapper[7518]: I0319 09:21:04.299811 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/259aa9cc-51a9-498e-b099-ba4d781801c5-kube-api-access\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.300544 master-0 kubenswrapper[7518]: I0319 09:21:04.299890 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-var-lock\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " 
pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.306852 master-0 kubenswrapper[7518]: W0319 09:21:04.306801 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a4fd337_c385_4f56_965c_d68ee0a4e848.slice/crio-bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741 WatchSource:0}: Error finding container bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741: Status 404 returned error can't find the container with id bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741 Mar 19 09:21:04.314679 master-0 kubenswrapper[7518]: I0319 09:21:04.314633 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p88qq"] Mar 19 09:21:04.322252 master-0 kubenswrapper[7518]: W0319 09:21:04.321857 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52fa1ad_0071_4506_bb94_e73d6f15a75c.slice/crio-80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7 WatchSource:0}: Error finding container 80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7: Status 404 returned error can't find the container with id 80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7 Mar 19 09:21:04.401682 master-0 kubenswrapper[7518]: I0319 09:21:04.401513 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-var-lock\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.401744 master-0 kubenswrapper[7518]: I0319 09:21:04.401711 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-kubelet-dir\") pod \"installer-1-master-0\" (UID: 
\"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.401789 master-0 kubenswrapper[7518]: I0319 09:21:04.401769 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/259aa9cc-51a9-498e-b099-ba4d781801c5-kube-api-access\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.406575 master-0 kubenswrapper[7518]: I0319 09:21:04.401921 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-var-lock\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.406575 master-0 kubenswrapper[7518]: I0319 09:21:04.402424 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.430134 master-0 kubenswrapper[7518]: I0319 09:21:04.430049 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/259aa9cc-51a9-498e-b099-ba4d781801c5-kube-api-access\") pod \"installer-1-master-0\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.468538 master-0 kubenswrapper[7518]: I0319 09:21:04.468065 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:04.696831 master-0 kubenswrapper[7518]: I0319 09:21:04.696335 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 09:21:04.934859 master-0 kubenswrapper[7518]: I0319 09:21:04.934806 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" event={"ID":"7214416f-03b4-4507-918b-ca3c0c95773e","Type":"ContainerStarted","Data":"76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618"} Mar 19 09:21:04.935129 master-0 kubenswrapper[7518]: I0319 09:21:04.934887 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" podUID="7214416f-03b4-4507-918b-ca3c0c95773e" containerName="route-controller-manager" containerID="cri-o://76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618" gracePeriod=30 Mar 19 09:21:04.935176 master-0 kubenswrapper[7518]: I0319 09:21:04.935166 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:21:04.944896 master-0 kubenswrapper[7518]: I0319 09:21:04.941966 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:21:04.944896 master-0 kubenswrapper[7518]: I0319 09:21:04.942043 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p88qq" event={"ID":"b8f39c16-3a94-45c3-a51c-f2e81eff967d","Type":"ContainerStarted","Data":"2ebe3fb9cab9178261c34fb487eaacac7fa326d405ced605571d043522371ecf"} Mar 19 09:21:04.949499 master-0 kubenswrapper[7518]: I0319 09:21:04.946955 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" 
event={"ID":"7fda0d28-6511-4577-9cd3-58a9c1a64d4e","Type":"ContainerStarted","Data":"66963603be080a8fd8ea25a6bca5e0cd067eb8bd4f03080fd14c359ca91ec696"} Mar 19 09:21:04.954373 master-0 kubenswrapper[7518]: I0319 09:21:04.954283 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" event={"ID":"7fda0d28-6511-4577-9cd3-58a9c1a64d4e","Type":"ContainerStarted","Data":"28e3c243e9aa17a8b785c259d586a2532c2c1b1ce191ff462d38340601511000"} Mar 19 09:21:04.959578 master-0 kubenswrapper[7518]: I0319 09:21:04.958310 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"259aa9cc-51a9-498e-b099-ba4d781801c5","Type":"ContainerStarted","Data":"e5c5cd2c130a06e83a755f581cc3a20c2c3dce618468e51c158559ad4071da8b"} Mar 19 09:21:04.959810 master-0 kubenswrapper[7518]: I0319 09:21:04.959781 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pmxm8" event={"ID":"d52fa1ad-0071-4506-bb94-e73d6f15a75c","Type":"ContainerStarted","Data":"52c972c8bb3bccbc755d962e203a0df51bce0728337b9ac6d8fe50087b16a579"} Mar 19 09:21:04.959810 master-0 kubenswrapper[7518]: I0319 09:21:04.959805 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pmxm8" event={"ID":"d52fa1ad-0071-4506-bb94-e73d6f15a75c","Type":"ContainerStarted","Data":"80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7"} Mar 19 09:21:04.962185 master-0 kubenswrapper[7518]: I0319 09:21:04.962156 7518 generic.go:334] "Generic (PLEG): container finished" podID="b2bff8a5-c45d-4d28-8771-2239ad0fa578" containerID="a48adae5f84d07444bee0a5da7af010f18bdba5c7270d3b00d241369bd585daa" exitCode=0 Mar 19 09:21:04.962265 master-0 kubenswrapper[7518]: I0319 09:21:04.962212 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" 
event={"ID":"b2bff8a5-c45d-4d28-8771-2239ad0fa578","Type":"ContainerDied","Data":"a48adae5f84d07444bee0a5da7af010f18bdba5c7270d3b00d241369bd585daa"} Mar 19 09:21:04.968267 master-0 kubenswrapper[7518]: I0319 09:21:04.968139 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" podStartSLOduration=21.42184415 podStartE2EDuration="26.968117712s" podCreationTimestamp="2026-03-19 09:20:38 +0000 UTC" firstStartedPulling="2026-03-19 09:20:58.490984352 +0000 UTC m=+76.373567611" lastFinishedPulling="2026-03-19 09:21:04.037257914 +0000 UTC m=+81.919841173" observedRunningTime="2026-03-19 09:21:04.966118774 +0000 UTC m=+82.848702053" watchObservedRunningTime="2026-03-19 09:21:04.968117712 +0000 UTC m=+82.850700971" Mar 19 09:21:04.970351 master-0 kubenswrapper[7518]: I0319 09:21:04.970313 7518 generic.go:334] "Generic (PLEG): container finished" podID="3a4fd337-c385-4f56-965c-d68ee0a4e848" containerID="87f01015e01422976c49ff53ddbd24b82fcd8498b6ca2d45f75d0b8a77fa808e" exitCode=0 Mar 19 09:21:04.970413 master-0 kubenswrapper[7518]: I0319 09:21:04.970356 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" event={"ID":"3a4fd337-c385-4f56-965c-d68ee0a4e848","Type":"ContainerDied","Data":"87f01015e01422976c49ff53ddbd24b82fcd8498b6ca2d45f75d0b8a77fa808e"} Mar 19 09:21:04.970413 master-0 kubenswrapper[7518]: I0319 09:21:04.970380 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" event={"ID":"3a4fd337-c385-4f56-965c-d68ee0a4e848","Type":"ContainerStarted","Data":"bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741"} Mar 19 09:21:05.018570 master-0 kubenswrapper[7518]: I0319 09:21:05.018078 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pmxm8" podStartSLOduration=2.0180529209999998 
podStartE2EDuration="2.018052921s" podCreationTimestamp="2026-03-19 09:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:05.016948265 +0000 UTC m=+82.899531524" watchObservedRunningTime="2026-03-19 09:21:05.018052921 +0000 UTC m=+82.900636180" Mar 19 09:21:05.057983 master-0 kubenswrapper[7518]: I0319 09:21:05.056608 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" podStartSLOduration=5.056586185 podStartE2EDuration="5.056586185s" podCreationTimestamp="2026-03-19 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:05.054235846 +0000 UTC m=+82.936819105" watchObservedRunningTime="2026-03-19 09:21:05.056586185 +0000 UTC m=+82.939169444" Mar 19 09:21:05.301888 master-0 kubenswrapper[7518]: I0319 09:21:05.301838 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:21:05.425843 master-0 kubenswrapper[7518]: I0319 09:21:05.425795 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") pod \"7214416f-03b4-4507-918b-ca3c0c95773e\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " Mar 19 09:21:05.426027 master-0 kubenswrapper[7518]: I0319 09:21:05.425882 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6chx\" (UniqueName: \"kubernetes.io/projected/7214416f-03b4-4507-918b-ca3c0c95773e-kube-api-access-m6chx\") pod \"7214416f-03b4-4507-918b-ca3c0c95773e\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " Mar 19 09:21:05.426027 master-0 kubenswrapper[7518]: I0319 09:21:05.425945 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-config\") pod \"7214416f-03b4-4507-918b-ca3c0c95773e\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " Mar 19 09:21:05.426027 master-0 kubenswrapper[7518]: I0319 09:21:05.425965 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-client-ca\") pod \"7214416f-03b4-4507-918b-ca3c0c95773e\" (UID: \"7214416f-03b4-4507-918b-ca3c0c95773e\") " Mar 19 09:21:05.427046 master-0 kubenswrapper[7518]: I0319 09:21:05.426819 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7214416f-03b4-4507-918b-ca3c0c95773e" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:05.427361 master-0 kubenswrapper[7518]: I0319 09:21:05.427304 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-config" (OuterVolumeSpecName: "config") pod "7214416f-03b4-4507-918b-ca3c0c95773e" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:05.440641 master-0 kubenswrapper[7518]: I0319 09:21:05.434871 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7214416f-03b4-4507-918b-ca3c0c95773e" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:21:05.440641 master-0 kubenswrapper[7518]: I0319 09:21:05.436098 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7214416f-03b4-4507-918b-ca3c0c95773e-kube-api-access-m6chx" (OuterVolumeSpecName: "kube-api-access-m6chx") pod "7214416f-03b4-4507-918b-ca3c0c95773e" (UID: "7214416f-03b4-4507-918b-ca3c0c95773e"). InnerVolumeSpecName "kube-api-access-m6chx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:05.526878 master-0 kubenswrapper[7518]: I0319 09:21:05.526829 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6chx\" (UniqueName: \"kubernetes.io/projected/7214416f-03b4-4507-918b-ca3c0c95773e-kube-api-access-m6chx\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:05.526878 master-0 kubenswrapper[7518]: I0319 09:21:05.526867 7518 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:05.526878 master-0 kubenswrapper[7518]: I0319 09:21:05.526881 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7214416f-03b4-4507-918b-ca3c0c95773e-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:05.526878 master-0 kubenswrapper[7518]: I0319 09:21:05.526892 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7214416f-03b4-4507-918b-ca3c0c95773e-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:05.977583 master-0 kubenswrapper[7518]: I0319 09:21:05.976509 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"259aa9cc-51a9-498e-b099-ba4d781801c5","Type":"ContainerStarted","Data":"89a2fc8df576416ddd348c57ed4c730f0abfa16882e2a3cc4358c65c4a9606ca"} Mar 19 09:21:05.982926 master-0 kubenswrapper[7518]: I0319 09:21:05.982863 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" event={"ID":"b2bff8a5-c45d-4d28-8771-2239ad0fa578","Type":"ContainerStarted","Data":"6fcdb194133a77437bb49a40433b1455d1694735bdee82a4502dcdac52139a93"} Mar 19 09:21:05.985709 master-0 kubenswrapper[7518]: I0319 09:21:05.985662 7518 generic.go:334] "Generic (PLEG): container finished" 
podID="7214416f-03b4-4507-918b-ca3c0c95773e" containerID="76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618" exitCode=0 Mar 19 09:21:05.985824 master-0 kubenswrapper[7518]: I0319 09:21:05.985786 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" Mar 19 09:21:05.985824 master-0 kubenswrapper[7518]: I0319 09:21:05.985812 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" event={"ID":"7214416f-03b4-4507-918b-ca3c0c95773e","Type":"ContainerDied","Data":"76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618"} Mar 19 09:21:05.985897 master-0 kubenswrapper[7518]: I0319 09:21:05.985848 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt" event={"ID":"7214416f-03b4-4507-918b-ca3c0c95773e","Type":"ContainerDied","Data":"4a22ef1476d8d21d51a15c1fd47011040d8afa763f696b2a13ec917bbfbd6be8"} Mar 19 09:21:05.985897 master-0 kubenswrapper[7518]: I0319 09:21:05.985894 7518 scope.go:117] "RemoveContainer" containerID="76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618" Mar 19 09:21:05.990496 master-0 kubenswrapper[7518]: I0319 09:21:05.990442 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" event={"ID":"3a4fd337-c385-4f56-965c-d68ee0a4e848","Type":"ContainerStarted","Data":"7478081e6f04e118b13640c2042213bc8cf7285644dd4ae5670d1033f9b67814"} Mar 19 09:21:05.991313 master-0 kubenswrapper[7518]: I0319 09:21:05.990642 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" event={"ID":"3a4fd337-c385-4f56-965c-d68ee0a4e848","Type":"ContainerStarted","Data":"a895c1eb3445ebf5ccbeac108d101891f84dadae1420bce0d3f295dfaaf68f6c"} Mar 19 09:21:06.000719 master-0 
kubenswrapper[7518]: I0319 09:21:05.999852 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=1.9998225330000001 podStartE2EDuration="1.999822533s" podCreationTimestamp="2026-03-19 09:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:05.995173735 +0000 UTC m=+83.877757004" watchObservedRunningTime="2026-03-19 09:21:05.999822533 +0000 UTC m=+83.882405792"
Mar 19 09:21:06.011148 master-0 kubenswrapper[7518]: I0319 09:21:06.011037 7518 scope.go:117] "RemoveContainer" containerID="76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618"
Mar 19 09:21:06.011744 master-0 kubenswrapper[7518]: E0319 09:21:06.011698 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618\": container with ID starting with 76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618 not found: ID does not exist" containerID="76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618"
Mar 19 09:21:06.011796 master-0 kubenswrapper[7518]: I0319 09:21:06.011750 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618"} err="failed to get container status \"76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618\": rpc error: code = NotFound desc = could not find container \"76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618\": container with ID starting with 76b921dec3e411447f59401c69f8d7c5a063d1c0b89d6157ccccc56561753618 not found: ID does not exist"
Mar 19 09:21:06.029360 master-0 kubenswrapper[7518]: I0319 09:21:06.027948 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" podStartSLOduration=11.499812626 podStartE2EDuration="17.027929214s" podCreationTimestamp="2026-03-19 09:20:49 +0000 UTC" firstStartedPulling="2026-03-19 09:20:58.490988752 +0000 UTC m=+76.373572001" lastFinishedPulling="2026-03-19 09:21:04.01910533 +0000 UTC m=+81.901688589" observedRunningTime="2026-03-19 09:21:06.02694381 +0000 UTC m=+83.909527069" watchObservedRunningTime="2026-03-19 09:21:06.027929214 +0000 UTC m=+83.910512473"
Mar 19 09:21:06.048381 master-0 kubenswrapper[7518]: I0319 09:21:06.048309 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" podStartSLOduration=18.048294432 podStartE2EDuration="18.048294432s" podCreationTimestamp="2026-03-19 09:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:06.047563228 +0000 UTC m=+83.930146497" watchObservedRunningTime="2026-03-19 09:21:06.048294432 +0000 UTC m=+83.930877691"
Mar 19 09:21:06.061329 master-0 kubenswrapper[7518]: I0319 09:21:06.061280 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"]
Mar 19 09:21:06.077495 master-0 kubenswrapper[7518]: I0319 09:21:06.077023 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58559b7f6c-j4rrt"]
Mar 19 09:21:06.167842 master-0 kubenswrapper[7518]: I0319 09:21:06.167790 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"
Mar 19 09:21:06.172036 master-0 kubenswrapper[7518]: I0319 09:21:06.171953 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:21:06.172453 master-0 kubenswrapper[7518]: I0319 09:21:06.172416 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:21:06.181746 master-0 kubenswrapper[7518]: I0319 09:21:06.181710 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:21:06.183834 master-0 kubenswrapper[7518]: I0319 09:21:06.183807 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"]
Mar 19 09:21:06.184009 master-0 kubenswrapper[7518]: E0319 09:21:06.183987 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7214416f-03b4-4507-918b-ca3c0c95773e" containerName="route-controller-manager"
Mar 19 09:21:06.184009 master-0 kubenswrapper[7518]: I0319 09:21:06.184005 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="7214416f-03b4-4507-918b-ca3c0c95773e" containerName="route-controller-manager"
Mar 19 09:21:06.184099 master-0 kubenswrapper[7518]: I0319 09:21:06.184082 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="7214416f-03b4-4507-918b-ca3c0c95773e" containerName="route-controller-manager"
Mar 19 09:21:06.184399 master-0 kubenswrapper[7518]: I0319 09:21:06.184379 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.193052 master-0 kubenswrapper[7518]: I0319 09:21:06.192999 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:21:06.193194 master-0 kubenswrapper[7518]: I0319 09:21:06.193169 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:21:06.193255 master-0 kubenswrapper[7518]: I0319 09:21:06.193021 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:21:06.193458 master-0 kubenswrapper[7518]: I0319 09:21:06.193430 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:21:06.193651 master-0 kubenswrapper[7518]: I0319 09:21:06.193620 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:21:06.197864 master-0 kubenswrapper[7518]: I0319 09:21:06.197801 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"]
Mar 19 09:21:06.329073 master-0 kubenswrapper[7518]: I0319 09:21:06.328942 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7214416f-03b4-4507-918b-ca3c0c95773e" path="/var/lib/kubelet/pods/7214416f-03b4-4507-918b-ca3c0c95773e/volumes"
Mar 19 09:21:06.336955 master-0 kubenswrapper[7518]: I0319 09:21:06.336679 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-config\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.336955 master-0 kubenswrapper[7518]: I0319 09:21:06.336825 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clr2t\" (UniqueName: \"kubernetes.io/projected/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-kube-api-access-clr2t\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.336955 master-0 kubenswrapper[7518]: I0319 09:21:06.336892 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-serving-cert\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.336955 master-0 kubenswrapper[7518]: I0319 09:21:06.336917 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-client-ca\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.444592 master-0 kubenswrapper[7518]: I0319 09:21:06.444512 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clr2t\" (UniqueName: \"kubernetes.io/projected/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-kube-api-access-clr2t\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.444803 master-0 kubenswrapper[7518]: I0319 09:21:06.444622 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-serving-cert\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.444803 master-0 kubenswrapper[7518]: I0319 09:21:06.444652 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-client-ca\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.444803 master-0 kubenswrapper[7518]: I0319 09:21:06.444708 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-config\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.445992 master-0 kubenswrapper[7518]: I0319 09:21:06.445966 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-config\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.459463 master-0 kubenswrapper[7518]: I0319 09:21:06.459422 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-client-ca\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.477591 master-0 kubenswrapper[7518]: I0319 09:21:06.473512 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-serving-cert\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.486014 master-0 kubenswrapper[7518]: I0319 09:21:06.485971 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clr2t\" (UniqueName: \"kubernetes.io/projected/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-kube-api-access-clr2t\") pod \"route-controller-manager-686585f447-gm2z5\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") " pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:06.527699 master-0 kubenswrapper[7518]: I0319 09:21:06.527642 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:07.000364 master-0 kubenswrapper[7518]: I0319 09:21:07.000291 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:21:08.542625 master-0 kubenswrapper[7518]: I0319 09:21:08.542513 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:08.542625 master-0 kubenswrapper[7518]: I0319 09:21:08.542609 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:09.419510 master-0 kubenswrapper[7518]: I0319 09:21:09.415160 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 19 09:21:09.419510 master-0 kubenswrapper[7518]: I0319 09:21:09.415453 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="e70442bc-7032-4700-9b0b-9f71acce25ad" containerName="installer" containerID="cri-o://cedcc447f352ca925d6be4191ca4c2529ff3315af9ffcd9eb08176f813b434f8" gracePeriod=30
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: I0319 09:21:09.877134 7518 patch_prober.go:28] interesting pod/apiserver-54cd8888b9-q4ztg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]log ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]etcd ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/max-in-flight-filter ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/openshift.io-startinformers ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: livez check failed
Mar 19 09:21:09.877391 master-0 kubenswrapper[7518]: I0319 09:21:09.877205 7518 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" podUID="3a4fd337-c385-4f56-965c-d68ee0a4e848" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 09:21:10.011947 master-0 kubenswrapper[7518]: I0319 09:21:10.011878 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_e70442bc-7032-4700-9b0b-9f71acce25ad/installer/0.log"
Mar 19 09:21:10.012224 master-0 kubenswrapper[7518]: I0319 09:21:10.011977 7518 generic.go:334] "Generic (PLEG): container finished" podID="e70442bc-7032-4700-9b0b-9f71acce25ad" containerID="cedcc447f352ca925d6be4191ca4c2529ff3315af9ffcd9eb08176f813b434f8" exitCode=1
Mar 19 09:21:10.012224 master-0 kubenswrapper[7518]: I0319 09:21:10.012040 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"e70442bc-7032-4700-9b0b-9f71acce25ad","Type":"ContainerDied","Data":"cedcc447f352ca925d6be4191ca4c2529ff3315af9ffcd9eb08176f813b434f8"}
Mar 19 09:21:10.475620 master-0 kubenswrapper[7518]: I0319 09:21:10.475027 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_e70442bc-7032-4700-9b0b-9f71acce25ad/installer/0.log"
Mar 19 09:21:10.475620 master-0 kubenswrapper[7518]: I0319 09:21:10.475119 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:10.510567 master-0 kubenswrapper[7518]: I0319 09:21:10.510487 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e70442bc-7032-4700-9b0b-9f71acce25ad-kube-api-access\") pod \"e70442bc-7032-4700-9b0b-9f71acce25ad\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") "
Mar 19 09:21:10.510567 master-0 kubenswrapper[7518]: I0319 09:21:10.510557 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-kubelet-dir\") pod \"e70442bc-7032-4700-9b0b-9f71acce25ad\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") "
Mar 19 09:21:10.510867 master-0 kubenswrapper[7518]: I0319 09:21:10.510606 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-var-lock\") pod \"e70442bc-7032-4700-9b0b-9f71acce25ad\" (UID: \"e70442bc-7032-4700-9b0b-9f71acce25ad\") "
Mar 19 09:21:10.510867 master-0 kubenswrapper[7518]: I0319 09:21:10.510766 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-var-lock" (OuterVolumeSpecName: "var-lock") pod "e70442bc-7032-4700-9b0b-9f71acce25ad" (UID: "e70442bc-7032-4700-9b0b-9f71acce25ad"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:21:10.510867 master-0 kubenswrapper[7518]: I0319 09:21:10.510784 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e70442bc-7032-4700-9b0b-9f71acce25ad" (UID: "e70442bc-7032-4700-9b0b-9f71acce25ad"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:21:10.511448 master-0 kubenswrapper[7518]: I0319 09:21:10.511384 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:10.511448 master-0 kubenswrapper[7518]: I0319 09:21:10.511420 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e70442bc-7032-4700-9b0b-9f71acce25ad-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:10.515309 master-0 kubenswrapper[7518]: I0319 09:21:10.515193 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70442bc-7032-4700-9b0b-9f71acce25ad-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e70442bc-7032-4700-9b0b-9f71acce25ad" (UID: "e70442bc-7032-4700-9b0b-9f71acce25ad"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:21:10.611844 master-0 kubenswrapper[7518]: I0319 09:21:10.611768 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e70442bc-7032-4700-9b0b-9f71acce25ad-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:11.019273 master-0 kubenswrapper[7518]: I0319 09:21:11.019188 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_e70442bc-7032-4700-9b0b-9f71acce25ad/installer/0.log"
Mar 19 09:21:11.019273 master-0 kubenswrapper[7518]: I0319 09:21:11.019254 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"e70442bc-7032-4700-9b0b-9f71acce25ad","Type":"ContainerDied","Data":"f8ac3b71437154c471149874ab3e7f3a947283e3919139d78279c41d7256c32d"}
Mar 19 09:21:11.019273 master-0 kubenswrapper[7518]: I0319 09:21:11.019294 7518 scope.go:117] "RemoveContainer" containerID="cedcc447f352ca925d6be4191ca4c2529ff3315af9ffcd9eb08176f813b434f8"
Mar 19 09:21:11.021953 master-0 kubenswrapper[7518]: I0319 09:21:11.019412 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 19 09:21:11.366304 master-0 kubenswrapper[7518]: I0319 09:21:11.366137 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"]
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: I0319 09:21:14.112664 7518 patch_prober.go:28] interesting pod/apiserver-54cd8888b9-q4ztg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]log ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]etcd ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/max-in-flight-filter ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/openshift.io-startinformers ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: livez check failed
Mar 19 09:21:14.114768 master-0 kubenswrapper[7518]: I0319 09:21:14.112819 7518 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" podUID="3a4fd337-c385-4f56-965c-d68ee0a4e848" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 09:21:14.190859 master-0 kubenswrapper[7518]: I0319 09:21:14.189680 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 19 09:21:14.195495 master-0 kubenswrapper[7518]: I0319 09:21:14.193124 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 19 09:21:14.195495 master-0 kubenswrapper[7518]: E0319 09:21:14.193332 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70442bc-7032-4700-9b0b-9f71acce25ad" containerName="installer"
Mar 19 09:21:14.195495 master-0 kubenswrapper[7518]: I0319 09:21:14.193345 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70442bc-7032-4700-9b0b-9f71acce25ad" containerName="installer"
Mar 19 09:21:14.195495 master-0 kubenswrapper[7518]: I0319 09:21:14.193428 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70442bc-7032-4700-9b0b-9f71acce25ad" containerName="installer"
Mar 19 09:21:14.195495 master-0 kubenswrapper[7518]: I0319 09:21:14.193822 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.229717 master-0 kubenswrapper[7518]: I0319 09:21:14.226896 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 19 09:21:14.250530 master-0 kubenswrapper[7518]: I0319 09:21:14.234531 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 19 09:21:14.267540 master-0 kubenswrapper[7518]: I0319 09:21:14.266613 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kube-api-access\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.267540 master-0 kubenswrapper[7518]: I0319 09:21:14.266668 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-var-lock\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.267540 master-0 kubenswrapper[7518]: I0319 09:21:14.266699 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.364549 master-0 kubenswrapper[7518]: I0319 09:21:14.359602 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70442bc-7032-4700-9b0b-9f71acce25ad" path="/var/lib/kubelet/pods/e70442bc-7032-4700-9b0b-9f71acce25ad/volumes"
Mar 19 09:21:14.376218 master-0 kubenswrapper[7518]: I0319 09:21:14.375934 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kube-api-access\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.376218 master-0 kubenswrapper[7518]: I0319 09:21:14.376034 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-var-lock\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.376218 master-0 kubenswrapper[7518]: I0319 09:21:14.376054 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.376218 master-0 kubenswrapper[7518]: I0319 09:21:14.376128 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.376865 master-0 kubenswrapper[7518]: I0319 09:21:14.376770 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-var-lock\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.446625 master-0 kubenswrapper[7518]: I0319 09:21:14.446570 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kube-api-access\") pod \"installer-3-master-0\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:14.498695 master-0 kubenswrapper[7518]: I0319 09:21:14.494427 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"]
Mar 19 09:21:14.498695 master-0 kubenswrapper[7518]: I0319 09:21:14.494684 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" podUID="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" containerName="cluster-version-operator" containerID="cri-o://c7e68cb3271256a9333d55ffab578a3758ec1fbf9021fe986d32592d8ec62834" gracePeriod=130
Mar 19 09:21:14.576536 master-0 kubenswrapper[7518]: I0319 09:21:14.571092 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 09:21:15.050591 master-0 kubenswrapper[7518]: I0319 09:21:15.050536 7518 generic.go:334] "Generic (PLEG): container finished" podID="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" containerID="c7e68cb3271256a9333d55ffab578a3758ec1fbf9021fe986d32592d8ec62834" exitCode=0
Mar 19 09:21:15.050821 master-0 kubenswrapper[7518]: I0319 09:21:15.050645 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" event={"ID":"32b1ae47-ef83-448d-b40d-a836cb6c6fc0","Type":"ContainerDied","Data":"c7e68cb3271256a9333d55ffab578a3758ec1fbf9021fe986d32592d8ec62834"}
Mar 19 09:21:15.053148 master-0 kubenswrapper[7518]: I0319 09:21:15.052508 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" event={"ID":"92a3a7fe-9b83-4f48-aa8e-ad1618f75388","Type":"ContainerStarted","Data":"1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0"}
Mar 19 09:21:15.053148 master-0 kubenswrapper[7518]: I0319 09:21:15.052558 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" event={"ID":"92a3a7fe-9b83-4f48-aa8e-ad1618f75388","Type":"ContainerStarted","Data":"cb1b0e98b52bdef9f348c83014e1a1d0690f840c8349dd937aabbdb11ee4d3d4"}
Mar 19 09:21:15.053148 master-0 kubenswrapper[7518]: I0319 09:21:15.052811 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:15.060209 master-0 kubenswrapper[7518]: I0319 09:21:15.056640 7518 generic.go:334] "Generic (PLEG): container finished" podID="8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823" containerID="4903db04251051a54ad7e347003826304ccc0327af5e8e5393199af2a3df5cfe" exitCode=0
Mar 19 09:21:15.060209 master-0 kubenswrapper[7518]: I0319 09:21:15.056725 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" event={"ID":"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823","Type":"ContainerDied","Data":"4903db04251051a54ad7e347003826304ccc0327af5e8e5393199af2a3df5cfe"}
Mar 19 09:21:15.060209 master-0 kubenswrapper[7518]: I0319 09:21:15.057169 7518 scope.go:117] "RemoveContainer" containerID="4903db04251051a54ad7e347003826304ccc0327af5e8e5393199af2a3df5cfe"
Mar 19 09:21:15.071876 master-0 kubenswrapper[7518]: I0319 09:21:15.071124 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p88qq" event={"ID":"b8f39c16-3a94-45c3-a51c-f2e81eff967d","Type":"ContainerStarted","Data":"7063f64e15737418d1aaec77049d2f1792d9efa2b0b3859ba062b6eaad6cadcc"}
Mar 19 09:21:15.071876 master-0 kubenswrapper[7518]: I0319 09:21:15.071169 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p88qq" event={"ID":"b8f39c16-3a94-45c3-a51c-f2e81eff967d","Type":"ContainerStarted","Data":"0bdb69501b280316b591fb961cd30f4536933bc30fb8b12f95a1464975f2d4e2"}
Mar 19 09:21:15.071876 master-0 kubenswrapper[7518]: I0319 09:21:15.071686 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-p88qq"
Mar 19 09:21:15.273081 master-0 kubenswrapper[7518]: I0319 09:21:15.273021 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 19 09:21:15.273795 master-0 kubenswrapper[7518]: I0319 09:21:15.273655 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" podStartSLOduration=17.273632725 podStartE2EDuration="17.273632725s" podCreationTimestamp="2026-03-19 09:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:15.269138253 +0000 UTC m=+93.151721542" watchObservedRunningTime="2026-03-19 09:21:15.273632725 +0000 UTC m=+93.156215984"
Mar 19 09:21:15.281538 master-0 kubenswrapper[7518]: I0319 09:21:15.281431 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"
Mar 19 09:21:15.281916 master-0 kubenswrapper[7518]: W0319 09:21:15.281880 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaaad607f_196d_4b6d_9919_d691cdbf1fc1.slice/crio-d6b9337a90215f782430cc02f441d2e3379a66d1ef66339554d1a48b38bb6681 WatchSource:0}: Error finding container d6b9337a90215f782430cc02f441d2e3379a66d1ef66339554d1a48b38bb6681: Status 404 returned error can't find the container with id d6b9337a90215f782430cc02f441d2e3379a66d1ef66339554d1a48b38bb6681
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.407455 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") pod \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") "
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.407597 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") pod \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") "
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.407673 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") pod \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") "
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.407709 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") pod \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") "
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.407740 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") pod \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\" (UID: \"32b1ae47-ef83-448d-b40d-a836cb6c6fc0\") "
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.407844 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "32b1ae47-ef83-448d-b40d-a836cb6c6fc0" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:21:15.408093 master-0 kubenswrapper[7518]: I0319 09:21:15.408020 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "32b1ae47-ef83-448d-b40d-a836cb6c6fc0" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:21:15.408928 master-0 kubenswrapper[7518]: I0319 09:21:15.408570 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "32b1ae47-ef83-448d-b40d-a836cb6c6fc0" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:21:15.410687 master-0 kubenswrapper[7518]: I0319 09:21:15.410662 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "32b1ae47-ef83-448d-b40d-a836cb6c6fc0" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:21:15.412100 master-0 kubenswrapper[7518]: I0319 09:21:15.412057 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32b1ae47-ef83-448d-b40d-a836cb6c6fc0" (UID: "32b1ae47-ef83-448d-b40d-a836cb6c6fc0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:21:15.510500 master-0 kubenswrapper[7518]: I0319 09:21:15.509175 7518 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:15.510500 master-0 kubenswrapper[7518]: I0319 09:21:15.509228 7518 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:15.510500 master-0 kubenswrapper[7518]: I0319 09:21:15.509238 7518 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-etc-ssl-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:15.510500 master-0 kubenswrapper[7518]: I0319 09:21:15.509249 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:15.510500 master-0 kubenswrapper[7518]: I0319 09:21:15.509261 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b1ae47-ef83-448d-b40d-a836cb6c6fc0-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:15.769419 master-0 kubenswrapper[7518]: I0319 09:21:15.769290 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 09:21:15.769788 master-0 kubenswrapper[7518]: E0319 09:21:15.769772 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" containerName="cluster-version-operator"
Mar 19 09:21:15.769850 master-0 kubenswrapper[7518]: I0319 09:21:15.769841 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" containerName="cluster-version-operator"
Mar 19 09:21:15.769984 master-0 kubenswrapper[7518]: I0319 09:21:15.769972 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" containerName="cluster-version-operator"
Mar 19 09:21:15.770439 master-0 kubenswrapper[7518]: I0319 09:21:15.770420 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.774961 master-0 kubenswrapper[7518]: I0319 09:21:15.773938 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:21:15.812534 master-0 kubenswrapper[7518]: I0319 09:21:15.812444 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-var-lock\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.812898 master-0 kubenswrapper[7518]: I0319 09:21:15.812632 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.812898 master-0 kubenswrapper[7518]: I0319 09:21:15.812746 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.832939 master-0 kubenswrapper[7518]: I0319 09:21:15.832847 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p88qq" podStartSLOduration=5.072330555 podStartE2EDuration="14.832825616s" podCreationTimestamp="2026-03-19 09:21:01 +0000 UTC" firstStartedPulling="2026-03-19 09:21:04.339642666 +0000 UTC m=+82.222225925" lastFinishedPulling="2026-03-19 09:21:14.100137727 +0000 UTC 
m=+91.982720986" observedRunningTime="2026-03-19 09:21:15.832664351 +0000 UTC m=+93.715247610" watchObservedRunningTime="2026-03-19 09:21:15.832825616 +0000 UTC m=+93.715408875" Mar 19 09:21:15.836439 master-0 kubenswrapper[7518]: I0319 09:21:15.835778 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" Mar 19 09:21:15.913749 master-0 kubenswrapper[7518]: I0319 09:21:15.913696 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.914159 master-0 kubenswrapper[7518]: I0319 09:21:15.914117 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-var-lock\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.914324 master-0 kubenswrapper[7518]: I0319 09:21:15.914203 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-var-lock\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.914391 master-0 kubenswrapper[7518]: I0319 09:21:15.914304 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " 
pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:15.914735 master-0 kubenswrapper[7518]: I0319 09:21:15.914521 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.040244 master-0 kubenswrapper[7518]: I0319 09:21:16.040125 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:21:16.080744 master-0 kubenswrapper[7518]: I0319 09:21:16.080700 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"aaad607f-196d-4b6d-9919-d691cdbf1fc1","Type":"ContainerStarted","Data":"a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5"} Mar 19 09:21:16.081018 master-0 kubenswrapper[7518]: I0319 09:21:16.080998 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"aaad607f-196d-4b6d-9919-d691cdbf1fc1","Type":"ContainerStarted","Data":"d6b9337a90215f782430cc02f441d2e3379a66d1ef66339554d1a48b38bb6681"} Mar 19 09:21:16.082228 master-0 kubenswrapper[7518]: I0319 09:21:16.082207 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" event={"ID":"32b1ae47-ef83-448d-b40d-a836cb6c6fc0","Type":"ContainerDied","Data":"e8d72b34e27d40c589a01f72d5d166b2daee8cc6371b989889cbb67dad2e3fcc"} Mar 19 09:21:16.082381 master-0 kubenswrapper[7518]: I0319 09:21:16.082220 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2" Mar 19 09:21:16.082499 master-0 kubenswrapper[7518]: I0319 09:21:16.082345 7518 scope.go:117] "RemoveContainer" containerID="c7e68cb3271256a9333d55ffab578a3758ec1fbf9021fe986d32592d8ec62834" Mar 19 09:21:16.084336 master-0 kubenswrapper[7518]: I0319 09:21:16.084298 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" event={"ID":"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823","Type":"ContainerStarted","Data":"c0063b7bd48757c8036dac601a564741ec0978350580d1f9ab872a270a04b1cc"} Mar 19 09:21:16.257123 master-0 kubenswrapper[7518]: I0319 09:21:16.257056 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.299586 master-0 kubenswrapper[7518]: I0319 09:21:16.298382 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"] Mar 19 09:21:16.303574 master-0 kubenswrapper[7518]: I0319 09:21:16.303403 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-sbhx2"] Mar 19 09:21:16.346492 master-0 kubenswrapper[7518]: I0319 09:21:16.345841 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b1ae47-ef83-448d-b40d-a836cb6c6fc0" path="/var/lib/kubelet/pods/32b1ae47-ef83-448d-b40d-a836cb6c6fc0/volumes" Mar 19 09:21:16.386496 master-0 kubenswrapper[7518]: I0319 09:21:16.385624 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:16.411496 master-0 kubenswrapper[7518]: I0319 09:21:16.404233 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.404212941 podStartE2EDuration="2.404212941s" podCreationTimestamp="2026-03-19 09:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:16.397312358 +0000 UTC m=+94.279895627" watchObservedRunningTime="2026-03-19 09:21:16.404212941 +0000 UTC m=+94.286796210" Mar 19 09:21:16.426981 master-0 kubenswrapper[7518]: I0319 09:21:16.426923 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7d58488df-thkn2"] Mar 19 09:21:16.427634 master-0 kubenswrapper[7518]: I0319 09:21:16.427588 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.435669 master-0 kubenswrapper[7518]: I0319 09:21:16.430729 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:21:16.435669 master-0 kubenswrapper[7518]: I0319 09:21:16.431089 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:21:16.435669 master-0 kubenswrapper[7518]: I0319 09:21:16.433038 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:21:16.526791 master-0 kubenswrapper[7518]: I0319 09:21:16.523012 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-ssl-certs\") pod 
\"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.526791 master-0 kubenswrapper[7518]: I0319 09:21:16.523149 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc9945ac-4041-4120-b504-a173c2bf91bd-serving-cert\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.526791 master-0 kubenswrapper[7518]: I0319 09:21:16.523272 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc9945ac-4041-4120-b504-a173c2bf91bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.526791 master-0 kubenswrapper[7518]: I0319 09:21:16.523344 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc9945ac-4041-4120-b504-a173c2bf91bd-service-ca\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.526791 master-0 kubenswrapper[7518]: I0319 09:21:16.523377 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " 
pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.644242 master-0 kubenswrapper[7518]: I0319 09:21:16.644182 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc9945ac-4041-4120-b504-a173c2bf91bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.644242 master-0 kubenswrapper[7518]: I0319 09:21:16.644239 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc9945ac-4041-4120-b504-a173c2bf91bd-service-ca\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.644551 master-0 kubenswrapper[7518]: I0319 09:21:16.644263 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.644551 master-0 kubenswrapper[7518]: I0319 09:21:16.644325 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.644551 master-0 kubenswrapper[7518]: I0319 09:21:16.644357 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc9945ac-4041-4120-b504-a173c2bf91bd-serving-cert\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.645411 master-0 kubenswrapper[7518]: I0319 09:21:16.645338 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.645518 master-0 kubenswrapper[7518]: I0319 09:21:16.645434 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.646584 master-0 kubenswrapper[7518]: I0319 09:21:16.646156 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc9945ac-4041-4120-b504-a173c2bf91bd-service-ca\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.649136 master-0 kubenswrapper[7518]: I0319 09:21:16.649094 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc9945ac-4041-4120-b504-a173c2bf91bd-serving-cert\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " 
pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.679575 master-0 kubenswrapper[7518]: I0319 09:21:16.679494 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc9945ac-4041-4120-b504-a173c2bf91bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.725745 master-0 kubenswrapper[7518]: I0319 09:21:16.725681 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 09:21:16.735784 master-0 kubenswrapper[7518]: W0319 09:21:16.735221 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc7a807d4_04b4_40ec_b855_5aea08b58bcd.slice/crio-718e025466e104d8976f4d87e1922f752df5dac18e0a4a9bb53767720efd6215 WatchSource:0}: Error finding container 718e025466e104d8976f4d87e1922f752df5dac18e0a4a9bb53767720efd6215: Status 404 returned error can't find the container with id 718e025466e104d8976f4d87e1922f752df5dac18e0a4a9bb53767720efd6215 Mar 19 09:21:16.773929 master-0 kubenswrapper[7518]: I0319 09:21:16.773883 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:21:16.795263 master-0 kubenswrapper[7518]: W0319 09:21:16.793958 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9945ac_4041_4120_b504_a173c2bf91bd.slice/crio-3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5 WatchSource:0}: Error finding container 3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5: Status 404 returned error can't find the container with id 3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5 Mar 19 09:21:17.091606 master-0 kubenswrapper[7518]: I0319 09:21:17.091370 7518 generic.go:334] "Generic (PLEG): container finished" podID="357980ba-1957-412f-afb5-04281eca2bee" containerID="fdd9285acae300c3c00a66ae69c66c3dae68ae6703f408d0bdc875283085bf0e" exitCode=0 Mar 19 09:21:17.091606 master-0 kubenswrapper[7518]: I0319 09:21:17.091441 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" event={"ID":"357980ba-1957-412f-afb5-04281eca2bee","Type":"ContainerDied","Data":"fdd9285acae300c3c00a66ae69c66c3dae68ae6703f408d0bdc875283085bf0e"} Mar 19 09:21:17.092343 master-0 kubenswrapper[7518]: I0319 09:21:17.092306 7518 scope.go:117] "RemoveContainer" containerID="fdd9285acae300c3c00a66ae69c66c3dae68ae6703f408d0bdc875283085bf0e" Mar 19 09:21:17.094038 master-0 kubenswrapper[7518]: I0319 09:21:17.093915 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c7a807d4-04b4-40ec-b855-5aea08b58bcd","Type":"ContainerStarted","Data":"ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3"} Mar 19 09:21:17.094091 master-0 kubenswrapper[7518]: I0319 09:21:17.093960 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c7a807d4-04b4-40ec-b855-5aea08b58bcd","Type":"ContainerStarted","Data":"718e025466e104d8976f4d87e1922f752df5dac18e0a4a9bb53767720efd6215"} Mar 19 09:21:17.095813 master-0 kubenswrapper[7518]: I0319 09:21:17.095764 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" event={"ID":"dc9945ac-4041-4120-b504-a173c2bf91bd","Type":"ContainerStarted","Data":"98a3c757d25e72af4783071b99abecca447c92902cb290dff70e2427905c98a0"} Mar 19 09:21:17.095813 master-0 kubenswrapper[7518]: I0319 09:21:17.095794 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" event={"ID":"dc9945ac-4041-4120-b504-a173c2bf91bd","Type":"ContainerStarted","Data":"3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5"} Mar 19 09:21:17.171528 master-0 kubenswrapper[7518]: I0319 09:21:17.166185 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" podStartSLOduration=1.166166984 podStartE2EDuration="1.166166984s" podCreationTimestamp="2026-03-19 09:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:17.164682813 +0000 UTC m=+95.047266092" watchObservedRunningTime="2026-03-19 09:21:17.166166984 +0000 UTC m=+95.048750243" Mar 19 09:21:17.192036 master-0 kubenswrapper[7518]: I0319 09:21:17.191568 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=2.191525712 podStartE2EDuration="2.191525712s" podCreationTimestamp="2026-03-19 09:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
09:21:17.188607903 +0000 UTC m=+95.071191152" watchObservedRunningTime="2026-03-19 09:21:17.191525712 +0000 UTC m=+95.074108981" Mar 19 09:21:17.419374 master-0 kubenswrapper[7518]: I0319 09:21:17.418313 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fcf878b4-mjm86"] Mar 19 09:21:17.419374 master-0 kubenswrapper[7518]: I0319 09:21:17.418610 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" podUID="de27a71b-4736-46c2-8de7-d409fa52685d" containerName="controller-manager" containerID="cri-o://61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e" gracePeriod=30 Mar 19 09:21:17.449289 master-0 kubenswrapper[7518]: I0319 09:21:17.449232 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"] Mar 19 09:21:17.811309 master-0 kubenswrapper[7518]: I0319 09:21:17.811045 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" Mar 19 09:21:17.869061 master-0 kubenswrapper[7518]: I0319 09:21:17.868989 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-config\") pod \"de27a71b-4736-46c2-8de7-d409fa52685d\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " Mar 19 09:21:17.869287 master-0 kubenswrapper[7518]: I0319 09:21:17.869116 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jltpk\" (UniqueName: \"kubernetes.io/projected/de27a71b-4736-46c2-8de7-d409fa52685d-kube-api-access-jltpk\") pod \"de27a71b-4736-46c2-8de7-d409fa52685d\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " Mar 19 09:21:17.869287 master-0 kubenswrapper[7518]: I0319 09:21:17.869160 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de27a71b-4736-46c2-8de7-d409fa52685d-serving-cert\") pod \"de27a71b-4736-46c2-8de7-d409fa52685d\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " Mar 19 09:21:17.869287 master-0 kubenswrapper[7518]: I0319 09:21:17.869198 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-client-ca\") pod \"de27a71b-4736-46c2-8de7-d409fa52685d\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " Mar 19 09:21:17.869287 master-0 kubenswrapper[7518]: I0319 09:21:17.869223 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-proxy-ca-bundles\") pod \"de27a71b-4736-46c2-8de7-d409fa52685d\" (UID: \"de27a71b-4736-46c2-8de7-d409fa52685d\") " Mar 19 09:21:17.870210 master-0 kubenswrapper[7518]: I0319 09:21:17.870179 7518 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "de27a71b-4736-46c2-8de7-d409fa52685d" (UID: "de27a71b-4736-46c2-8de7-d409fa52685d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:17.870390 master-0 kubenswrapper[7518]: I0319 09:21:17.870323 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-config" (OuterVolumeSpecName: "config") pod "de27a71b-4736-46c2-8de7-d409fa52685d" (UID: "de27a71b-4736-46c2-8de7-d409fa52685d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:17.870698 master-0 kubenswrapper[7518]: I0319 09:21:17.870648 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-client-ca" (OuterVolumeSpecName: "client-ca") pod "de27a71b-4736-46c2-8de7-d409fa52685d" (UID: "de27a71b-4736-46c2-8de7-d409fa52685d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:21:17.872955 master-0 kubenswrapper[7518]: I0319 09:21:17.872896 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de27a71b-4736-46c2-8de7-d409fa52685d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "de27a71b-4736-46c2-8de7-d409fa52685d" (UID: "de27a71b-4736-46c2-8de7-d409fa52685d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:21:17.873153 master-0 kubenswrapper[7518]: I0319 09:21:17.873077 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de27a71b-4736-46c2-8de7-d409fa52685d-kube-api-access-jltpk" (OuterVolumeSpecName: "kube-api-access-jltpk") pod "de27a71b-4736-46c2-8de7-d409fa52685d" (UID: "de27a71b-4736-46c2-8de7-d409fa52685d"). InnerVolumeSpecName "kube-api-access-jltpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:21:17.973161 master-0 kubenswrapper[7518]: I0319 09:21:17.970251 7518 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:17.973161 master-0 kubenswrapper[7518]: I0319 09:21:17.970304 7518 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:17.973161 master-0 kubenswrapper[7518]: I0319 09:21:17.970324 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de27a71b-4736-46c2-8de7-d409fa52685d-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:17.973161 master-0 kubenswrapper[7518]: I0319 09:21:17.970339 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jltpk\" (UniqueName: \"kubernetes.io/projected/de27a71b-4736-46c2-8de7-d409fa52685d-kube-api-access-jltpk\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:17.973161 master-0 kubenswrapper[7518]: I0319 09:21:17.970349 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de27a71b-4736-46c2-8de7-d409fa52685d-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:18.113039 master-0 kubenswrapper[7518]: I0319 09:21:18.112858 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" event={"ID":"357980ba-1957-412f-afb5-04281eca2bee","Type":"ContainerStarted","Data":"8f937bf2a32923428b71421021158b825e34a4e1c42099873b4f7d29047736a5"}
Mar 19 09:21:18.114843 master-0 kubenswrapper[7518]: I0319 09:21:18.114785 7518 generic.go:334] "Generic (PLEG): container finished" podID="de27a71b-4736-46c2-8de7-d409fa52685d" containerID="61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e" exitCode=0
Mar 19 09:21:18.115401 master-0 kubenswrapper[7518]: I0319 09:21:18.115302 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86"
Mar 19 09:21:18.118268 master-0 kubenswrapper[7518]: I0319 09:21:18.117649 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" event={"ID":"de27a71b-4736-46c2-8de7-d409fa52685d","Type":"ContainerDied","Data":"61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e"}
Mar 19 09:21:18.118268 master-0 kubenswrapper[7518]: I0319 09:21:18.117710 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fcf878b4-mjm86" event={"ID":"de27a71b-4736-46c2-8de7-d409fa52685d","Type":"ContainerDied","Data":"d1a947a6e4cfe0689bee11c725d7351d0259fe7b72181ce7fe0ec6b785ca7c59"}
Mar 19 09:21:18.118268 master-0 kubenswrapper[7518]: I0319 09:21:18.117732 7518 scope.go:117] "RemoveContainer" containerID="61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e"
Mar 19 09:21:18.118268 master-0 kubenswrapper[7518]: I0319 09:21:18.118026 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" podUID="92a3a7fe-9b83-4f48-aa8e-ad1618f75388" containerName="route-controller-manager" containerID="cri-o://1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0" gracePeriod=30
Mar 19 09:21:18.144080 master-0 kubenswrapper[7518]: I0319 09:21:18.144049 7518 scope.go:117] "RemoveContainer" containerID="61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e"
Mar 19 09:21:18.148156 master-0 kubenswrapper[7518]: E0319 09:21:18.148081 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e\": container with ID starting with 61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e not found: ID does not exist" containerID="61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e"
Mar 19 09:21:18.148281 master-0 kubenswrapper[7518]: I0319 09:21:18.148171 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e"} err="failed to get container status \"61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e\": rpc error: code = NotFound desc = could not find container \"61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e\": container with ID starting with 61655463d3ea2337e16e72704a18d55f443f3be89ab38dc73df65fdeb43d090e not found: ID does not exist"
Mar 19 09:21:18.197567 master-0 kubenswrapper[7518]: I0319 09:21:18.197322 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fcf878b4-mjm86"]
Mar 19 09:21:18.202904 master-0 kubenswrapper[7518]: I0319 09:21:18.202861 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fcf878b4-mjm86"]
Mar 19 09:21:18.340195 master-0 kubenswrapper[7518]: I0319 09:21:18.339132 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de27a71b-4736-46c2-8de7-d409fa52685d" path="/var/lib/kubelet/pods/de27a71b-4736-46c2-8de7-d409fa52685d/volumes"
Mar 19 09:21:18.533811 master-0 kubenswrapper[7518]: I0319 09:21:18.533763 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:18.551706 master-0 kubenswrapper[7518]: I0319 09:21:18.551662 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:18.556954 master-0 kubenswrapper[7518]: I0319 09:21:18.556911 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:21:18.594208 master-0 kubenswrapper[7518]: I0319 09:21:18.593792 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-serving-cert\") pod \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") "
Mar 19 09:21:18.594208 master-0 kubenswrapper[7518]: I0319 09:21:18.593881 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-client-ca\") pod \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") "
Mar 19 09:21:18.594208 master-0 kubenswrapper[7518]: I0319 09:21:18.593915 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-config\") pod \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") "
Mar 19 09:21:18.594208 master-0 kubenswrapper[7518]: I0319 09:21:18.593940 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clr2t\" (UniqueName: \"kubernetes.io/projected/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-kube-api-access-clr2t\") pod \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\" (UID: \"92a3a7fe-9b83-4f48-aa8e-ad1618f75388\") "
Mar 19 09:21:18.595170 master-0 kubenswrapper[7518]: I0319 09:21:18.594725 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-client-ca" (OuterVolumeSpecName: "client-ca") pod "92a3a7fe-9b83-4f48-aa8e-ad1618f75388" (UID: "92a3a7fe-9b83-4f48-aa8e-ad1618f75388"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:21:18.595170 master-0 kubenswrapper[7518]: I0319 09:21:18.594882 7518 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:18.595170 master-0 kubenswrapper[7518]: I0319 09:21:18.594960 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-config" (OuterVolumeSpecName: "config") pod "92a3a7fe-9b83-4f48-aa8e-ad1618f75388" (UID: "92a3a7fe-9b83-4f48-aa8e-ad1618f75388"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:21:18.596930 master-0 kubenswrapper[7518]: I0319 09:21:18.596899 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-kube-api-access-clr2t" (OuterVolumeSpecName: "kube-api-access-clr2t") pod "92a3a7fe-9b83-4f48-aa8e-ad1618f75388" (UID: "92a3a7fe-9b83-4f48-aa8e-ad1618f75388"). InnerVolumeSpecName "kube-api-access-clr2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:21:18.597176 master-0 kubenswrapper[7518]: I0319 09:21:18.597151 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "92a3a7fe-9b83-4f48-aa8e-ad1618f75388" (UID: "92a3a7fe-9b83-4f48-aa8e-ad1618f75388"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:21:18.698593 master-0 kubenswrapper[7518]: I0319 09:21:18.697117 7518 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:18.698593 master-0 kubenswrapper[7518]: I0319 09:21:18.697156 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:18.698593 master-0 kubenswrapper[7518]: I0319 09:21:18.697167 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clr2t\" (UniqueName: \"kubernetes.io/projected/92a3a7fe-9b83-4f48-aa8e-ad1618f75388-kube-api-access-clr2t\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:19.123357 master-0 kubenswrapper[7518]: I0319 09:21:19.123028 7518 generic.go:334] "Generic (PLEG): container finished" podID="92a3a7fe-9b83-4f48-aa8e-ad1618f75388" containerID="1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0" exitCode=0
Mar 19 09:21:19.123357 master-0 kubenswrapper[7518]: I0319 09:21:19.123117 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" event={"ID":"92a3a7fe-9b83-4f48-aa8e-ad1618f75388","Type":"ContainerDied","Data":"1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0"}
Mar 19 09:21:19.123357 master-0 kubenswrapper[7518]: I0319 09:21:19.123162 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5" event={"ID":"92a3a7fe-9b83-4f48-aa8e-ad1618f75388","Type":"ContainerDied","Data":"cb1b0e98b52bdef9f348c83014e1a1d0690f840c8349dd937aabbdb11ee4d3d4"}
Mar 19 09:21:19.123357 master-0 kubenswrapper[7518]: I0319 09:21:19.123181 7518 scope.go:117] "RemoveContainer" containerID="1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0"
Mar 19 09:21:19.123357 master-0 kubenswrapper[7518]: I0319 09:21:19.123318 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"
Mar 19 09:21:19.138492 master-0 kubenswrapper[7518]: I0319 09:21:19.138445 7518 scope.go:117] "RemoveContainer" containerID="1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0"
Mar 19 09:21:19.138998 master-0 kubenswrapper[7518]: E0319 09:21:19.138961 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0\": container with ID starting with 1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0 not found: ID does not exist" containerID="1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0"
Mar 19 09:21:19.139065 master-0 kubenswrapper[7518]: I0319 09:21:19.139008 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0"} err="failed to get container status \"1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0\": rpc error: code = NotFound desc = could not find container \"1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0\": container with ID starting with 1630809dcbfaad6361f93fdd89a41d149a094ee51e56eca64cdeddc5a1c4f0f0 not found: ID does not exist"
Mar 19 09:21:19.313131 master-0 kubenswrapper[7518]: I0319 09:21:19.313063 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"]
Mar 19 09:21:19.320613 master-0 kubenswrapper[7518]: I0319 09:21:19.320371 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"]
Mar 19 09:21:19.320613 master-0 kubenswrapper[7518]: E0319 09:21:19.320612 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a3a7fe-9b83-4f48-aa8e-ad1618f75388" containerName="route-controller-manager"
Mar 19 09:21:19.320987 master-0 kubenswrapper[7518]: I0319 09:21:19.320631 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a3a7fe-9b83-4f48-aa8e-ad1618f75388" containerName="route-controller-manager"
Mar 19 09:21:19.320987 master-0 kubenswrapper[7518]: E0319 09:21:19.320658 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de27a71b-4736-46c2-8de7-d409fa52685d" containerName="controller-manager"
Mar 19 09:21:19.320987 master-0 kubenswrapper[7518]: I0319 09:21:19.320666 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27a71b-4736-46c2-8de7-d409fa52685d" containerName="controller-manager"
Mar 19 09:21:19.320987 master-0 kubenswrapper[7518]: I0319 09:21:19.320769 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="de27a71b-4736-46c2-8de7-d409fa52685d" containerName="controller-manager"
Mar 19 09:21:19.320987 master-0 kubenswrapper[7518]: I0319 09:21:19.320784 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a3a7fe-9b83-4f48-aa8e-ad1618f75388" containerName="route-controller-manager"
Mar 19 09:21:19.321200 master-0 kubenswrapper[7518]: I0319 09:21:19.321176 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.321928 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"]
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.322567 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.322922 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.323394 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.323509 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.323638 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.323710 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.325993 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.326155 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.327253 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.327459 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:21:19.328056 master-0 kubenswrapper[7518]: I0319 09:21:19.327695 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:21:19.330020 master-0 kubenswrapper[7518]: I0319 09:21:19.329984 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-686585f447-gm2z5"]
Mar 19 09:21:19.333064 master-0 kubenswrapper[7518]: I0319 09:21:19.333024 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:21:19.408546 master-0 kubenswrapper[7518]: I0319 09:21:19.408285 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.408546 master-0 kubenswrapper[7518]: I0319 09:21:19.408349 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.408546 master-0 kubenswrapper[7518]: I0319 09:21:19.408373 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.408823 master-0 kubenswrapper[7518]: I0319 09:21:19.408580 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.408823 master-0 kubenswrapper[7518]: I0319 09:21:19.408629 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.408823 master-0 kubenswrapper[7518]: I0319 09:21:19.408674 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.408912 master-0 kubenswrapper[7518]: I0319 09:21:19.408812 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.409126 master-0 kubenswrapper[7518]: I0319 09:21:19.408948 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.409126 master-0 kubenswrapper[7518]: I0319 09:21:19.409034 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.509792 master-0 kubenswrapper[7518]: I0319 09:21:19.509742 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.509792 master-0 kubenswrapper[7518]: I0319 09:21:19.509791 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.509848 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.509879 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.509914 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.509942 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.509977 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.509999 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.510248 master-0 kubenswrapper[7518]: I0319 09:21:19.510024 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.511424 master-0 kubenswrapper[7518]: I0319 09:21:19.511228 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.512158 master-0 kubenswrapper[7518]: I0319 09:21:19.511518 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.512158 master-0 kubenswrapper[7518]: I0319 09:21:19.512016 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.512158 master-0 kubenswrapper[7518]: I0319 09:21:19.512067 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.514957 master-0 kubenswrapper[7518]: I0319 09:21:19.513735 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.514957 master-0 kubenswrapper[7518]: I0319 09:21:19.514864 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:19.515560 master-0 kubenswrapper[7518]: I0319 09:21:19.515395 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:19.609721 master-0 kubenswrapper[7518]: I0319 09:21:19.606918 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"]
Mar 19 09:21:19.609721 master-0 kubenswrapper[7518]: I0319 09:21:19.608871 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"]
Mar 19 09:21:20.133704 master-0 kubenswrapper[7518]: I0319 09:21:20.133603 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_8edda930-b012-4f1f-977a-a71ef8763fe3/installer/0.log"
Mar 19 09:21:20.133704 master-0 kubenswrapper[7518]: I0319 09:21:20.133657 7518 generic.go:334] "Generic (PLEG): container finished" podID="8edda930-b012-4f1f-977a-a71ef8763fe3" containerID="bfdd507a44c29b0cf95c9bc532b3b91ef64c10d4c5165041299ded0e08cc28ac" exitCode=1
Mar 19 09:21:20.133704 master-0 kubenswrapper[7518]: I0319 09:21:20.133690 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8edda930-b012-4f1f-977a-a71ef8763fe3","Type":"ContainerDied","Data":"bfdd507a44c29b0cf95c9bc532b3b91ef64c10d4c5165041299ded0e08cc28ac"}
Mar 19 09:21:20.211104 master-0 kubenswrapper[7518]: I0319 09:21:20.211044 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:20.216967 master-0 kubenswrapper[7518]: I0319 09:21:20.216531 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:20.245991 master-0 kubenswrapper[7518]: I0319 09:21:20.244837 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:20.259271 master-0 kubenswrapper[7518]: I0319 09:21:20.258787 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:20.333288 master-0 kubenswrapper[7518]: I0319 09:21:20.333201 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a3a7fe-9b83-4f48-aa8e-ad1618f75388" path="/var/lib/kubelet/pods/92a3a7fe-9b83-4f48-aa8e-ad1618f75388/volumes"
Mar 19 09:21:20.524251 master-0 kubenswrapper[7518]: I0319 09:21:20.524215 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_8edda930-b012-4f1f-977a-a71ef8763fe3/installer/0.log"
Mar 19 09:21:20.524383 master-0 kubenswrapper[7518]: I0319 09:21:20.524277 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:21:20.635509 master-0 kubenswrapper[7518]: I0319 09:21:20.635066 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-var-lock\") pod \"8edda930-b012-4f1f-977a-a71ef8763fe3\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") "
Mar 19 09:21:20.635509 master-0 kubenswrapper[7518]: I0319 09:21:20.635151 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8edda930-b012-4f1f-977a-a71ef8763fe3-kube-api-access\") pod \"8edda930-b012-4f1f-977a-a71ef8763fe3\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") "
Mar 19 09:21:20.635509 master-0 kubenswrapper[7518]: I0319 09:21:20.635208 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-kubelet-dir\") pod \"8edda930-b012-4f1f-977a-a71ef8763fe3\" (UID: \"8edda930-b012-4f1f-977a-a71ef8763fe3\") "
Mar 19 09:21:20.635509 master-0 kubenswrapper[7518]: I0319 09:21:20.635221 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-var-lock" (OuterVolumeSpecName: "var-lock") pod "8edda930-b012-4f1f-977a-a71ef8763fe3" (UID: "8edda930-b012-4f1f-977a-a71ef8763fe3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:21:20.635509 master-0 kubenswrapper[7518]: I0319 09:21:20.635350 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8edda930-b012-4f1f-977a-a71ef8763fe3" (UID: "8edda930-b012-4f1f-977a-a71ef8763fe3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:21:20.636159 master-0 kubenswrapper[7518]: I0319 09:21:20.635540 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:20.636159 master-0 kubenswrapper[7518]: I0319 09:21:20.635558 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8edda930-b012-4f1f-977a-a71ef8763fe3-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:20.638336 master-0 kubenswrapper[7518]: I0319 09:21:20.638286 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edda930-b012-4f1f-977a-a71ef8763fe3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8edda930-b012-4f1f-977a-a71ef8763fe3" (UID: "8edda930-b012-4f1f-977a-a71ef8763fe3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:21:20.738248 master-0 kubenswrapper[7518]: I0319 09:21:20.738192 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8edda930-b012-4f1f-977a-a71ef8763fe3-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:20.747446 master-0 kubenswrapper[7518]: I0319 09:21:20.747399 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"]
Mar 19 09:21:20.844839 master-0 kubenswrapper[7518]: I0319 09:21:20.844716 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"]
Mar 19 09:21:20.846755 master-0 kubenswrapper[7518]: W0319 09:21:20.846702 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d017ee_b94e_402f_90c1_ccb3f336b2a8.slice/crio-5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207 WatchSource:0}: Error finding container 5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207: Status 404 returned error can't find the container with id 5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207
Mar 19 09:21:21.140853 master-0 kubenswrapper[7518]: I0319 09:21:21.140726 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" event={"ID":"fedd4b33-c90e-42d5-bc29-73d1701bb671","Type":"ContainerStarted","Data":"e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d"}
Mar 19 09:21:21.140853 master-0 kubenswrapper[7518]: I0319 09:21:21.140777 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" event={"ID":"fedd4b33-c90e-42d5-bc29-73d1701bb671","Type":"ContainerStarted","Data":"dd1819a433e70ea4c2b01b165e8a76f6644d7959ff5dbef7efb1f362b56038c1"}
Mar 19 09:21:21.141086 master-0 kubenswrapper[7518]: I0319 09:21:21.140965 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:21.144531 master-0 kubenswrapper[7518]: I0319 09:21:21.144156 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_8edda930-b012-4f1f-977a-a71ef8763fe3/installer/0.log"
Mar 19 09:21:21.144531 master-0 kubenswrapper[7518]: I0319 09:21:21.144269 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8edda930-b012-4f1f-977a-a71ef8763fe3","Type":"ContainerDied","Data":"b2515e8d783e89d55f85d85bd5d14ced809801e3538f9daafbc170ce9d11b9e0"}
Mar 19 09:21:21.144531 master-0 kubenswrapper[7518]: I0319 09:21:21.144296 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 19 09:21:21.144531 master-0 kubenswrapper[7518]: I0319 09:21:21.144309 7518 scope.go:117] "RemoveContainer" containerID="bfdd507a44c29b0cf95c9bc532b3b91ef64c10d4c5165041299ded0e08cc28ac"
Mar 19 09:21:21.147612 master-0 kubenswrapper[7518]: I0319 09:21:21.146101 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:21:21.151098 master-0 kubenswrapper[7518]: I0319 09:21:21.150085 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" event={"ID":"01d017ee-b94e-402f-90c1-ccb3f336b2a8","Type":"ContainerStarted","Data":"00f2488d3b13e4e27e3e63246f1f84387bf26e062f88b6e05117ccf0841ee905"}
Mar 19 09:21:21.151098 master-0 kubenswrapper[7518]: I0319 09:21:21.150138 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" event={"ID":"01d017ee-b94e-402f-90c1-ccb3f336b2a8","Type":"ContainerStarted","Data":"5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207"}
Mar 19 09:21:21.151098 master-0 kubenswrapper[7518]: I0319 09:21:21.150406 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:21.156682 master-0 kubenswrapper[7518]: I0319 09:21:21.156310 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:21:21.159455 master-0 kubenswrapper[7518]: I0319 09:21:21.159382 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" podStartSLOduration=4.159360724 podStartE2EDuration="4.159360724s" podCreationTimestamp="2026-03-19 09:21:17 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:21.156800607 +0000 UTC m=+99.039383876" watchObservedRunningTime="2026-03-19 09:21:21.159360724 +0000 UTC m=+99.041943983" Mar 19 09:21:21.193374 master-0 kubenswrapper[7518]: I0319 09:21:21.193291 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" podStartSLOduration=4.193272901 podStartE2EDuration="4.193272901s" podCreationTimestamp="2026-03-19 09:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:21.192274697 +0000 UTC m=+99.074857976" watchObservedRunningTime="2026-03-19 09:21:21.193272901 +0000 UTC m=+99.075856160" Mar 19 09:21:21.285499 master-0 kubenswrapper[7518]: I0319 09:21:21.274431 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:21:21.285499 master-0 kubenswrapper[7518]: I0319 09:21:21.282588 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 09:21:21.582904 master-0 kubenswrapper[7518]: I0319 09:21:21.582857 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 09:21:21.583128 master-0 kubenswrapper[7518]: E0319 09:21:21.583030 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edda930-b012-4f1f-977a-a71ef8763fe3" containerName="installer" Mar 19 09:21:21.583128 master-0 kubenswrapper[7518]: I0319 09:21:21.583041 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edda930-b012-4f1f-977a-a71ef8763fe3" containerName="installer" Mar 19 09:21:21.583128 master-0 kubenswrapper[7518]: I0319 09:21:21.583127 7518 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8edda930-b012-4f1f-977a-a71ef8763fe3" containerName="installer" Mar 19 09:21:21.583730 master-0 kubenswrapper[7518]: I0319 09:21:21.583444 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.587092 master-0 kubenswrapper[7518]: I0319 09:21:21.587067 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:21:21.628538 master-0 kubenswrapper[7518]: I0319 09:21:21.628461 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 09:21:21.647766 master-0 kubenswrapper[7518]: I0319 09:21:21.647712 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.648295 master-0 kubenswrapper[7518]: I0319 09:21:21.647797 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.648295 master-0 kubenswrapper[7518]: I0319 09:21:21.647954 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-var-lock\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.749857 master-0 kubenswrapper[7518]: I0319 09:21:21.749808 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.750056 master-0 kubenswrapper[7518]: I0319 09:21:21.749927 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-var-lock\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.750056 master-0 kubenswrapper[7518]: I0319 09:21:21.749980 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.750056 master-0 kubenswrapper[7518]: I0319 09:21:21.750051 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.750382 master-0 kubenswrapper[7518]: I0319 09:21:21.750360 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-var-lock\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.784007 master-0 kubenswrapper[7518]: I0319 09:21:21.783806 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:21.898779 master-0 kubenswrapper[7518]: I0319 09:21:21.898544 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:21:22.236159 master-0 kubenswrapper[7518]: I0319 09:21:22.236098 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:21:22.236502 master-0 kubenswrapper[7518]: I0319 09:21:22.236289 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="aaad607f-196d-4b6d-9919-d691cdbf1fc1" containerName="installer" containerID="cri-o://a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5" gracePeriod=30 Mar 19 09:21:22.322268 master-0 kubenswrapper[7518]: I0319 09:21:22.322215 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edda930-b012-4f1f-977a-a71ef8763fe3" path="/var/lib/kubelet/pods/8edda930-b012-4f1f-977a-a71ef8763fe3/volumes" Mar 19 09:21:22.338638 master-0 kubenswrapper[7518]: I0319 09:21:22.334803 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 09:21:22.343896 master-0 kubenswrapper[7518]: W0319 09:21:22.343836 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0df23b55_3dea_4f5e_9d53_5c7755ea4e48.slice/crio-0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194 WatchSource:0}: Error finding container 0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194: Status 404 returned error can't find the container with id 0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194 Mar 19 09:21:22.709543 
master-0 kubenswrapper[7518]: I0319 09:21:22.706217 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_aaad607f-196d-4b6d-9919-d691cdbf1fc1/installer/0.log" Mar 19 09:21:22.709543 master-0 kubenswrapper[7518]: I0319 09:21:22.706323 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:22.766229 master-0 kubenswrapper[7518]: I0319 09:21:22.766092 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kubelet-dir\") pod \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " Mar 19 09:21:22.766438 master-0 kubenswrapper[7518]: I0319 09:21:22.766245 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kube-api-access\") pod \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " Mar 19 09:21:22.766438 master-0 kubenswrapper[7518]: I0319 09:21:22.766326 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-var-lock\") pod \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\" (UID: \"aaad607f-196d-4b6d-9919-d691cdbf1fc1\") " Mar 19 09:21:22.767891 master-0 kubenswrapper[7518]: I0319 09:21:22.766692 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-var-lock" (OuterVolumeSpecName: "var-lock") pod "aaad607f-196d-4b6d-9919-d691cdbf1fc1" (UID: "aaad607f-196d-4b6d-9919-d691cdbf1fc1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:22.767891 master-0 kubenswrapper[7518]: I0319 09:21:22.766741 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aaad607f-196d-4b6d-9919-d691cdbf1fc1" (UID: "aaad607f-196d-4b6d-9919-d691cdbf1fc1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:22.770762 master-0 kubenswrapper[7518]: I0319 09:21:22.770703 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aaad607f-196d-4b6d-9919-d691cdbf1fc1" (UID: "aaad607f-196d-4b6d-9919-d691cdbf1fc1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:22.868422 master-0 kubenswrapper[7518]: I0319 09:21:22.868183 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:22.868422 master-0 kubenswrapper[7518]: I0319 09:21:22.868236 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:22.868422 master-0 kubenswrapper[7518]: I0319 09:21:22.868253 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aaad607f-196d-4b6d-9919-d691cdbf1fc1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:23.160025 master-0 kubenswrapper[7518]: I0319 09:21:23.159864 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" 
event={"ID":"0df23b55-3dea-4f5e-9d53-5c7755ea4e48","Type":"ContainerStarted","Data":"cbac5fecef5ccbfed911c8dc762330e4e21b1d157632cde1feee52ece3850c21"} Mar 19 09:21:23.160025 master-0 kubenswrapper[7518]: I0319 09:21:23.159913 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"0df23b55-3dea-4f5e-9d53-5c7755ea4e48","Type":"ContainerStarted","Data":"0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194"} Mar 19 09:21:23.161495 master-0 kubenswrapper[7518]: I0319 09:21:23.161425 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_aaad607f-196d-4b6d-9919-d691cdbf1fc1/installer/0.log" Mar 19 09:21:23.161650 master-0 kubenswrapper[7518]: I0319 09:21:23.161581 7518 generic.go:334] "Generic (PLEG): container finished" podID="aaad607f-196d-4b6d-9919-d691cdbf1fc1" containerID="a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5" exitCode=1 Mar 19 09:21:23.161703 master-0 kubenswrapper[7518]: I0319 09:21:23.161659 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 09:21:23.161746 master-0 kubenswrapper[7518]: I0319 09:21:23.161692 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"aaad607f-196d-4b6d-9919-d691cdbf1fc1","Type":"ContainerDied","Data":"a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5"} Mar 19 09:21:23.161795 master-0 kubenswrapper[7518]: I0319 09:21:23.161750 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"aaad607f-196d-4b6d-9919-d691cdbf1fc1","Type":"ContainerDied","Data":"d6b9337a90215f782430cc02f441d2e3379a66d1ef66339554d1a48b38bb6681"} Mar 19 09:21:23.161795 master-0 kubenswrapper[7518]: I0319 09:21:23.161779 7518 scope.go:117] "RemoveContainer" containerID="a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5" Mar 19 09:21:23.172915 master-0 kubenswrapper[7518]: I0319 09:21:23.172870 7518 scope.go:117] "RemoveContainer" containerID="a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5" Mar 19 09:21:23.173317 master-0 kubenswrapper[7518]: E0319 09:21:23.173285 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5\": container with ID starting with a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5 not found: ID does not exist" containerID="a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5" Mar 19 09:21:23.173444 master-0 kubenswrapper[7518]: I0319 09:21:23.173324 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5"} err="failed to get container status \"a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5\": rpc error: code = NotFound desc = could not find 
container \"a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5\": container with ID starting with a4050e3578d6285e1f7c78f93a8105aa707f9e733518f8a5afed8219e02095f5 not found: ID does not exist" Mar 19 09:21:23.227299 master-0 kubenswrapper[7518]: E0319 09:21:23.227221 7518 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podaaad607f_196d_4b6d_9919_d691cdbf1fc1.slice\": RecentStats: unable to find data in memory cache]" Mar 19 09:21:23.303863 master-0 kubenswrapper[7518]: I0319 09:21:23.303801 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=2.303783695 podStartE2EDuration="2.303783695s" podCreationTimestamp="2026-03-19 09:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:23.301304641 +0000 UTC m=+101.183887910" watchObservedRunningTime="2026-03-19 09:21:23.303783695 +0000 UTC m=+101.186366954" Mar 19 09:21:23.336117 master-0 kubenswrapper[7518]: I0319 09:21:23.336050 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:21:23.344806 master-0 kubenswrapper[7518]: I0319 09:21:23.344720 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 09:21:24.141644 master-0 kubenswrapper[7518]: I0319 09:21:24.141583 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p88qq" Mar 19 09:21:24.325010 master-0 kubenswrapper[7518]: I0319 09:21:24.324756 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaad607f-196d-4b6d-9919-d691cdbf1fc1" path="/var/lib/kubelet/pods/aaad607f-196d-4b6d-9919-d691cdbf1fc1/volumes" Mar 19 09:21:24.619497 master-0 kubenswrapper[7518]: I0319 09:21:24.619435 
7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 09:21:24.619694 master-0 kubenswrapper[7518]: E0319 09:21:24.619646 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaad607f-196d-4b6d-9919-d691cdbf1fc1" containerName="installer" Mar 19 09:21:24.619694 master-0 kubenswrapper[7518]: I0319 09:21:24.619662 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaad607f-196d-4b6d-9919-d691cdbf1fc1" containerName="installer" Mar 19 09:21:24.619781 master-0 kubenswrapper[7518]: I0319 09:21:24.619760 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaad607f-196d-4b6d-9919-d691cdbf1fc1" containerName="installer" Mar 19 09:21:24.620167 master-0 kubenswrapper[7518]: I0319 09:21:24.620139 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.621926 master-0 kubenswrapper[7518]: I0319 09:21:24.621884 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-jh786" Mar 19 09:21:24.623713 master-0 kubenswrapper[7518]: I0319 09:21:24.623672 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 09:21:24.643355 master-0 kubenswrapper[7518]: I0319 09:21:24.643302 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 09:21:24.689002 master-0 kubenswrapper[7518]: I0319 09:21:24.688927 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-var-lock\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.689002 master-0 kubenswrapper[7518]: I0319 09:21:24.689001 7518 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.689254 master-0 kubenswrapper[7518]: I0319 09:21:24.689062 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de53594-9dcc-4318-806a-64f39ef76b3b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.790164 master-0 kubenswrapper[7518]: I0319 09:21:24.790107 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-var-lock\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.790164 master-0 kubenswrapper[7518]: I0319 09:21:24.790169 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.790444 master-0 kubenswrapper[7518]: I0319 09:21:24.790252 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-var-lock\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.790444 master-0 kubenswrapper[7518]: I0319 09:21:24.790349 7518 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de53594-9dcc-4318-806a-64f39ef76b3b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.790444 master-0 kubenswrapper[7518]: I0319 09:21:24.790394 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.806355 master-0 kubenswrapper[7518]: I0319 09:21:24.806288 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de53594-9dcc-4318-806a-64f39ef76b3b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:24.933789 master-0 kubenswrapper[7518]: I0319 09:21:24.933731 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:21:25.299982 master-0 kubenswrapper[7518]: I0319 09:21:25.299853 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 09:21:26.178432 master-0 kubenswrapper[7518]: I0319 09:21:26.178374 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"2de53594-9dcc-4318-806a-64f39ef76b3b","Type":"ContainerStarted","Data":"921b32f57f187453279e5e8112c07cdf7b75d2182a8ace33d227749c1f7857e9"} Mar 19 09:21:26.178432 master-0 kubenswrapper[7518]: I0319 09:21:26.178431 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"2de53594-9dcc-4318-806a-64f39ef76b3b","Type":"ContainerStarted","Data":"846174bbc21aaf0dbb6863b67ef55a4060d549089aa7226a91ee6bec43a301c1"} Mar 19 09:21:26.201674 master-0 kubenswrapper[7518]: I0319 09:21:26.201453 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=2.201432496 podStartE2EDuration="2.201432496s" podCreationTimestamp="2026-03-19 09:21:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:26.200340346 +0000 UTC m=+104.082923605" watchObservedRunningTime="2026-03-19 09:21:26.201432496 +0000 UTC m=+104.084015755" Mar 19 09:21:26.321190 master-0 kubenswrapper[7518]: I0319 09:21:26.321152 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"] Mar 19 09:21:26.321803 master-0 kubenswrapper[7518]: I0319 09:21:26.321788 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" Mar 19 09:21:26.325153 master-0 kubenswrapper[7518]: I0319 09:21:26.325122 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-jwt9n" Mar 19 09:21:26.327262 master-0 kubenswrapper[7518]: I0319 09:21:26.327189 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"] Mar 19 09:21:26.423205 master-0 kubenswrapper[7518]: I0319 09:21:26.423150 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xl5z\" (UniqueName: \"kubernetes.io/projected/f8fdab32-4e61-4e9c-a506-52121f625669-kube-api-access-5xl5z\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" Mar 19 09:21:26.423367 master-0 kubenswrapper[7518]: I0319 09:21:26.423274 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f8fdab32-4e61-4e9c-a506-52121f625669-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" Mar 19 09:21:26.524317 master-0 kubenswrapper[7518]: I0319 09:21:26.524172 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xl5z\" (UniqueName: \"kubernetes.io/projected/f8fdab32-4e61-4e9c-a506-52121f625669-kube-api-access-5xl5z\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" Mar 19 09:21:26.524317 master-0 kubenswrapper[7518]: I0319 09:21:26.524261 7518 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f8fdab32-4e61-4e9c-a506-52121f625669-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:21:26.527195 master-0 kubenswrapper[7518]: I0319 09:21:26.527148 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f8fdab32-4e61-4e9c-a506-52121f625669-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:21:26.545206 master-0 kubenswrapper[7518]: I0319 09:21:26.545111 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xl5z\" (UniqueName: \"kubernetes.io/projected/f8fdab32-4e61-4e9c-a506-52121f625669-kube-api-access-5xl5z\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:21:26.636266 master-0 kubenswrapper[7518]: I0319 09:21:26.636197 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:21:27.020267 master-0 kubenswrapper[7518]: I0319 09:21:27.020204 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"]
Mar 19 09:21:27.025903 master-0 kubenswrapper[7518]: W0319 09:21:27.025853 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fdab32_4e61_4e9c_a506_52121f625669.slice/crio-3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5 WatchSource:0}: Error finding container 3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5: Status 404 returned error can't find the container with id 3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5
Mar 19 09:21:27.184426 master-0 kubenswrapper[7518]: I0319 09:21:27.184270 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" event={"ID":"f8fdab32-4e61-4e9c-a506-52121f625669","Type":"ContainerStarted","Data":"3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5"}
Mar 19 09:21:28.774966 master-0 kubenswrapper[7518]: I0319 09:21:28.774864 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 09:21:28.775378 master-0 kubenswrapper[7518]: I0319 09:21:28.775072 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="c7a807d4-04b4-40ec-b855-5aea08b58bcd" containerName="installer" containerID="cri-o://ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3" gracePeriod=30
Mar 19 09:21:29.202514 master-0 kubenswrapper[7518]: I0319 09:21:29.202431 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" event={"ID":"f8fdab32-4e61-4e9c-a506-52121f625669","Type":"ContainerStarted","Data":"acf283a7ae197d666c775ab08c887351d964d47ec0b92c301618c88bed643bf2"}
Mar 19 09:21:29.202514 master-0 kubenswrapper[7518]: I0319 09:21:29.202518 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" event={"ID":"f8fdab32-4e61-4e9c-a506-52121f625669","Type":"ContainerStarted","Data":"609f0a30b6cdbd0eca29ca05457f09f3d96c6791f50aad172737012219c2834d"}
Mar 19 09:21:29.220694 master-0 kubenswrapper[7518]: I0319 09:21:29.220599 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" podStartSLOduration=1.724498073 podStartE2EDuration="3.220580052s" podCreationTimestamp="2026-03-19 09:21:26 +0000 UTC" firstStartedPulling="2026-03-19 09:21:27.02808432 +0000 UTC m=+104.910667579" lastFinishedPulling="2026-03-19 09:21:28.524166289 +0000 UTC m=+106.406749558" observedRunningTime="2026-03-19 09:21:29.215918336 +0000 UTC m=+107.098501615" watchObservedRunningTime="2026-03-19 09:21:29.220580052 +0000 UTC m=+107.103163311"
Mar 19 09:21:29.323062 master-0 kubenswrapper[7518]: I0319 09:21:29.322990 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"]
Mar 19 09:21:29.324314 master-0 kubenswrapper[7518]: E0319 09:21:29.323993 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[webhook-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b" podUID="bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d"
Mar 19 09:21:30.211016 master-0 kubenswrapper[7518]: I0319 09:21:30.210960 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-xhzf9_083882c0-ea2f-4405-8cf1-cce5b91fe602/openshift-controller-manager-operator/0.log"
Mar 19 09:21:30.211016 master-0 kubenswrapper[7518]: I0319 09:21:30.211015 7518 generic.go:334] "Generic (PLEG): container finished" podID="083882c0-ea2f-4405-8cf1-cce5b91fe602" containerID="787b47766f4f361558a231cbdd8f60cfc309ddb2f5ce9e60ddd25ab14ca4bf8c" exitCode=1
Mar 19 09:21:30.211715 master-0 kubenswrapper[7518]: I0319 09:21:30.211078 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:21:30.211715 master-0 kubenswrapper[7518]: I0319 09:21:30.211080 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" event={"ID":"083882c0-ea2f-4405-8cf1-cce5b91fe602","Type":"ContainerDied","Data":"787b47766f4f361558a231cbdd8f60cfc309ddb2f5ce9e60ddd25ab14ca4bf8c"}
Mar 19 09:21:30.211911 master-0 kubenswrapper[7518]: I0319 09:21:30.211857 7518 scope.go:117] "RemoveContainer" containerID="787b47766f4f361558a231cbdd8f60cfc309ddb2f5ce9e60ddd25ab14ca4bf8c"
Mar 19 09:21:30.220923 master-0 kubenswrapper[7518]: I0319 09:21:30.220886 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:21:30.373101 master-0 kubenswrapper[7518]: I0319 09:21:30.372942 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"]
Mar 19 09:21:30.378363 master-0 kubenswrapper[7518]: I0319 09:21:30.373908 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.378363 master-0 kubenswrapper[7518]: I0319 09:21:30.377939 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 09:21:30.378363 master-0 kubenswrapper[7518]: I0319 09:21:30.378148 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 09:21:30.378637 master-0 kubenswrapper[7518]: I0319 09:21:30.378415 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 09:21:30.378637 master-0 kubenswrapper[7518]: I0319 09:21:30.378616 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-bgq5z"
Mar 19 09:21:30.384372 master-0 kubenswrapper[7518]: I0319 09:21:30.383856 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") pod \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\" (UID: \"bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d\") "
Mar 19 09:21:30.396640 master-0 kubenswrapper[7518]: I0319 09:21:30.388163 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42" (OuterVolumeSpecName: "kube-api-access-kxv42") pod "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d" (UID: "bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d"). InnerVolumeSpecName "kube-api-access-kxv42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:21:30.448234 master-0 kubenswrapper[7518]: I0319 09:21:30.448152 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"]
Mar 19 09:21:30.485928 master-0 kubenswrapper[7518]: I0319 09:21:30.485776 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d486ce23-acf7-429a-9739-4770e1a2bf78-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.486218 master-0 kubenswrapper[7518]: I0319 09:21:30.486128 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzdjs\" (UniqueName: \"kubernetes.io/projected/d486ce23-acf7-429a-9739-4770e1a2bf78-kube-api-access-bzdjs\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.486449 master-0 kubenswrapper[7518]: I0319 09:21:30.486415 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxv42\" (UniqueName: \"kubernetes.io/projected/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-kube-api-access-kxv42\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:30.587574 master-0 kubenswrapper[7518]: I0319 09:21:30.587480 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d486ce23-acf7-429a-9739-4770e1a2bf78-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.587774 master-0 kubenswrapper[7518]: I0319 09:21:30.587586 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdjs\" (UniqueName: \"kubernetes.io/projected/d486ce23-acf7-429a-9739-4770e1a2bf78-kube-api-access-bzdjs\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.602296 master-0 kubenswrapper[7518]: I0319 09:21:30.602236 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d486ce23-acf7-429a-9739-4770e1a2bf78-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.652419 master-0 kubenswrapper[7518]: I0319 09:21:30.652382 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzdjs\" (UniqueName: \"kubernetes.io/projected/d486ce23-acf7-429a-9739-4770e1a2bf78-kube-api-access-bzdjs\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:30.708867 master-0 kubenswrapper[7518]: I0319 09:21:30.708789 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:21:31.125613 master-0 kubenswrapper[7518]: I0319 09:21:31.125410 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"]
Mar 19 09:21:31.216214 master-0 kubenswrapper[7518]: I0319 09:21:31.215979 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn" event={"ID":"d486ce23-acf7-429a-9739-4770e1a2bf78","Type":"ContainerStarted","Data":"2f229196290719614f7bbcbd70dc1d3eb6df4440414271052c0e25cb9764e057"}
Mar 19 09:21:31.219579 master-0 kubenswrapper[7518]: I0319 09:21:31.218041 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-xhzf9_083882c0-ea2f-4405-8cf1-cce5b91fe602/openshift-controller-manager-operator/0.log"
Mar 19 09:21:31.219579 master-0 kubenswrapper[7518]: I0319 09:21:31.218106 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"
Mar 19 09:21:31.223484 master-0 kubenswrapper[7518]: I0319 09:21:31.219834 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" event={"ID":"083882c0-ea2f-4405-8cf1-cce5b91fe602","Type":"ContainerStarted","Data":"394fddc10d9fdfc928b5b6beefe9ff72503e973fe67b6d5dcbf05abf24856450"}
Mar 19 09:21:31.311524 master-0 kubenswrapper[7518]: I0319 09:21:31.311290 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"]
Mar 19 09:21:31.320419 master-0 kubenswrapper[7518]: I0319 09:21:31.320357 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-mc76b"]
Mar 19 09:21:31.503150 master-0 kubenswrapper[7518]: I0319 09:21:31.503069 7518 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d-webhook-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:21:31.572899 master-0 kubenswrapper[7518]: I0319 09:21:31.572846 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 19 09:21:31.573400 master-0 kubenswrapper[7518]: I0319 09:21:31.573374 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.575617 master-0 kubenswrapper[7518]: I0319 09:21:31.575572 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wv2vd"
Mar 19 09:21:31.586516 master-0 kubenswrapper[7518]: I0319 09:21:31.584025 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 19 09:21:31.707062 master-0 kubenswrapper[7518]: I0319 09:21:31.706992 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc248e59-1519-4ac3-9005-2239214a8d62-kube-api-access\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.707435 master-0 kubenswrapper[7518]: I0319 09:21:31.707124 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-var-lock\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.707435 master-0 kubenswrapper[7518]: I0319 09:21:31.707156 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.808745 master-0 kubenswrapper[7518]: I0319 09:21:31.808555 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-var-lock\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.808745 master-0 kubenswrapper[7518]: I0319 09:21:31.808634 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.808745 master-0 kubenswrapper[7518]: I0319 09:21:31.808665 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc248e59-1519-4ac3-9005-2239214a8d62-kube-api-access\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.809237 master-0 kubenswrapper[7518]: I0319 09:21:31.808783 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.809237 master-0 kubenswrapper[7518]: I0319 09:21:31.808933 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-var-lock\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.826756 master-0 kubenswrapper[7518]: I0319 09:21:31.826663 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc248e59-1519-4ac3-9005-2239214a8d62-kube-api-access\") pod \"installer-2-master-0\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:31.916580 master-0 kubenswrapper[7518]: I0319 09:21:31.916441 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:21:32.326658 master-0 kubenswrapper[7518]: I0319 09:21:32.326597 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d" path="/var/lib/kubelet/pods/bf81a40d-9f45-4d54-a8dc-95dbfcd1f59d/volumes"
Mar 19 09:21:32.327069 master-0 kubenswrapper[7518]: I0319 09:21:32.326991 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 19 09:21:32.329824 master-0 kubenswrapper[7518]: W0319 09:21:32.329764 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddc248e59_1519_4ac3_9005_2239214a8d62.slice/crio-430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82 WatchSource:0}: Error finding container 430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82: Status 404 returned error can't find the container with id 430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82
Mar 19 09:21:33.230903 master-0 kubenswrapper[7518]: I0319 09:21:33.230821 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"dc248e59-1519-4ac3-9005-2239214a8d62","Type":"ContainerStarted","Data":"2b23049d85d383fc87e2217ac4c88730e6accf178c37b42720c1211cad94765e"}
Mar 19 09:21:33.230903 master-0 kubenswrapper[7518]: I0319 09:21:33.230873 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"dc248e59-1519-4ac3-9005-2239214a8d62","Type":"ContainerStarted","Data":"430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82"}
Mar 19 09:21:33.256853 master-0 kubenswrapper[7518]: I0319 09:21:33.256743 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=2.256707017 podStartE2EDuration="2.256707017s" podCreationTimestamp="2026-03-19 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:21:33.247702723 +0000 UTC m=+111.130285982" watchObservedRunningTime="2026-03-19 09:21:33.256707017 +0000 UTC m=+111.139290276"
Mar 19 09:21:33.687516 master-0 kubenswrapper[7518]: I0319 09:21:33.687410 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"]
Mar 19 09:21:33.689065 master-0 kubenswrapper[7518]: I0319 09:21:33.688414 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.696446 master-0 kubenswrapper[7518]: I0319 09:21:33.696385 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:21:33.696599 master-0 kubenswrapper[7518]: I0319 09:21:33.696446 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 09:21:33.696599 master-0 kubenswrapper[7518]: I0319 09:21:33.696501 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 09:21:33.696599 master-0 kubenswrapper[7518]: I0319 09:21:33.696557 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-l9t78"
Mar 19 09:21:33.696599 master-0 kubenswrapper[7518]: I0319 09:21:33.696404 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 09:21:33.696844 master-0 kubenswrapper[7518]: I0319 09:21:33.696810 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 09:21:33.839406 master-0 kubenswrapper[7518]: I0319 09:21:33.839316 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.839406 master-0 kubenswrapper[7518]: I0319 09:21:33.839380 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-config\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.839795 master-0 kubenswrapper[7518]: I0319 09:21:33.839433 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/de509e3d-5e9c-47be-bce2-adc4f435aea8-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.839795 master-0 kubenswrapper[7518]: I0319 09:21:33.839525 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggm26\" (UniqueName: \"kubernetes.io/projected/de509e3d-5e9c-47be-bce2-adc4f435aea8-kube-api-access-ggm26\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.940529 master-0 kubenswrapper[7518]: I0319 09:21:33.940444 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/de509e3d-5e9c-47be-bce2-adc4f435aea8-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.940871 master-0 kubenswrapper[7518]: I0319 09:21:33.940569 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggm26\" (UniqueName: \"kubernetes.io/projected/de509e3d-5e9c-47be-bce2-adc4f435aea8-kube-api-access-ggm26\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.941221 master-0 kubenswrapper[7518]: I0319 09:21:33.941163 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.941346 master-0 kubenswrapper[7518]: I0319 09:21:33.941325 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-config\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.942482 master-0 kubenswrapper[7518]: I0319 09:21:33.942399 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.942576 master-0 kubenswrapper[7518]: I0319 09:21:33.942511 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-config\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.944481 master-0 kubenswrapper[7518]: I0319 09:21:33.944418 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/de509e3d-5e9c-47be-bce2-adc4f435aea8-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:33.963322 master-0 kubenswrapper[7518]: I0319 09:21:33.963259 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggm26\" (UniqueName: \"kubernetes.io/projected/de509e3d-5e9c-47be-bce2-adc4f435aea8-kube-api-access-ggm26\") pod \"machine-approver-6cb57bb5db-qkbqh\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:34.042337 master-0 kubenswrapper[7518]: I0319 09:21:34.042272 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
Mar 19 09:21:34.060132 master-0 kubenswrapper[7518]: W0319 09:21:34.060077 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde509e3d_5e9c_47be_bce2_adc4f435aea8.slice/crio-b6f21e047d7fe1c17012e8b0e2eccf0c0df41f1dd7af47ee16ae785f35047af4 WatchSource:0}: Error finding container b6f21e047d7fe1c17012e8b0e2eccf0c0df41f1dd7af47ee16ae785f35047af4: Status 404 returned error can't find the container with id b6f21e047d7fe1c17012e8b0e2eccf0c0df41f1dd7af47ee16ae785f35047af4
Mar 19 09:21:34.255716 master-0 kubenswrapper[7518]: I0319 09:21:34.255524 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" event={"ID":"de509e3d-5e9c-47be-bce2-adc4f435aea8","Type":"ContainerStarted","Data":"b6f21e047d7fe1c17012e8b0e2eccf0c0df41f1dd7af47ee16ae785f35047af4"}
Mar 19 09:21:34.259290 master-0 kubenswrapper[7518]: I0319 09:21:34.259231 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn" event={"ID":"d486ce23-acf7-429a-9739-4770e1a2bf78","Type":"ContainerStarted","Data":"2ea52482522c190b31e0c2a767d192b95134c189dc647d21749261e6e8b31d1e"}
Mar 19 09:21:34.287348 master-0 kubenswrapper[7518]: I0319 09:21:34.287278 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn" podStartSLOduration=1.8042390369999999 podStartE2EDuration="4.287260802s" podCreationTimestamp="2026-03-19 09:21:30 +0000 UTC" firstStartedPulling="2026-03-19 09:21:31.137954848 +0000 UTC m=+109.020538107" lastFinishedPulling="2026-03-19 09:21:33.620976613 +0000 UTC m=+111.503559872" observedRunningTime="2026-03-19 09:21:34.286913843 +0000 UTC m=+112.169497122" watchObservedRunningTime="2026-03-19 09:21:34.287260802 +0000 UTC m=+112.169844061"
Mar 19 09:21:35.268331 master-0 kubenswrapper[7518]: I0319 09:21:35.268249 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" event={"ID":"de509e3d-5e9c-47be-bce2-adc4f435aea8","Type":"ContainerStarted","Data":"1cf12cc7445333b9b5a115e74969e46b1dc4dba3d931d8d5860393a5791b239a"}
Mar 19 09:21:36.107491 master-0 kubenswrapper[7518]: I0319 09:21:36.106776 7518 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 19 09:21:36.107491 master-0 kubenswrapper[7518]: I0319 09:21:36.107022 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" containerID="cri-o://24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b" gracePeriod=30
Mar 19 09:21:36.107491 master-0 kubenswrapper[7518]: I0319 09:21:36.107164 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" containerID="cri-o://5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a" gracePeriod=30
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: I0319 09:21:36.123057 7518 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: E0319 09:21:36.123255 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd"
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: I0319 09:21:36.123266 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd"
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: E0319 09:21:36.123273 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl"
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: I0319 09:21:36.123280 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl"
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: I0319 09:21:36.123370 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd"
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: I0319 09:21:36.123383 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl"
Mar 19 09:21:36.134136 master-0 kubenswrapper[7518]: I0319 09:21:36.124753 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.278069 master-0 kubenswrapper[7518]: I0319 09:21:36.277111 7518 generic.go:334] "Generic (PLEG): container finished" podID="86c4b0e4-3481-465d-b00f-022d2c58c183" containerID="f771ab2ec3cdd043d42f5957ed84808b36b0f576aa969f9e8666ac7eb9b0b134" exitCode=0
Mar 19 09:21:36.278069 master-0 kubenswrapper[7518]: I0319 09:21:36.277169 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" event={"ID":"86c4b0e4-3481-465d-b00f-022d2c58c183","Type":"ContainerDied","Data":"f771ab2ec3cdd043d42f5957ed84808b36b0f576aa969f9e8666ac7eb9b0b134"}
Mar 19 09:21:36.278069 master-0 kubenswrapper[7518]: I0319 09:21:36.277553 7518 scope.go:117] "RemoveContainer" containerID="f771ab2ec3cdd043d42f5957ed84808b36b0f576aa969f9e8666ac7eb9b0b134"
Mar 19 09:21:36.280545 master-0 kubenswrapper[7518]: I0319 09:21:36.280098 7518 generic.go:334] "Generic (PLEG): container finished" podID="a75049de-dcf1-4102-b339-f45d5015adea" containerID="239a4aff890f70e77543607e882c4861b3b7d9ef6cf1f395add14a0ad7fc62e0" exitCode=0
Mar 19 09:21:36.280545 master-0 kubenswrapper[7518]: I0319 09:21:36.280120 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" event={"ID":"a75049de-dcf1-4102-b339-f45d5015adea","Type":"ContainerDied","Data":"239a4aff890f70e77543607e882c4861b3b7d9ef6cf1f395add14a0ad7fc62e0"}
Mar 19 09:21:36.280545 master-0 kubenswrapper[7518]: I0319 09:21:36.280326 7518 scope.go:117] "RemoveContainer" containerID="239a4aff890f70e77543607e882c4861b3b7d9ef6cf1f395add14a0ad7fc62e0"
Mar 19 09:21:36.294408 master-0 kubenswrapper[7518]: I0319 09:21:36.294380 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.294612 master-0 kubenswrapper[7518]: I0319 09:21:36.294598 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.294725 master-0 kubenswrapper[7518]: I0319 09:21:36.294713 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.294819 master-0 kubenswrapper[7518]: I0319 09:21:36.294808 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.294926 master-0 kubenswrapper[7518]: I0319 09:21:36.294914 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.295166 master-0 kubenswrapper[7518]: I0319 09:21:36.295106 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.396345 master-0 kubenswrapper[7518]: I0319 09:21:36.396231 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.396682 master-0 kubenswrapper[7518]: I0319 09:21:36.396430 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.396994 master-0 kubenswrapper[7518]: I0319 09:21:36.396876 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.396994 master-0 kubenswrapper[7518]: I0319 09:21:36.396926 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.396994 master-0 kubenswrapper[7518]: I0319 09:21:36.396949 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:21:36.396994 master-0 kubenswrapper[7518]: I0319 09:21:36.396973 7518 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:36.397169 master-0 kubenswrapper[7518]: I0319 09:21:36.397029 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:36.397169 master-0 kubenswrapper[7518]: I0319 09:21:36.397055 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:36.397169 master-0 kubenswrapper[7518]: I0319 09:21:36.397090 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:36.397169 master-0 kubenswrapper[7518]: I0319 09:21:36.397129 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:36.397283 master-0 kubenswrapper[7518]: I0319 09:21:36.397196 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " 
pod="openshift-etcd/etcd-master-0" Mar 19 09:21:36.397283 master-0 kubenswrapper[7518]: I0319 09:21:36.397235 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:21:37.298802 master-0 kubenswrapper[7518]: I0319 09:21:37.298735 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" event={"ID":"a75049de-dcf1-4102-b339-f45d5015adea","Type":"ContainerStarted","Data":"42b9a79d42542a10355bd1a462df5ffb67f1a10eae7fe6919eb834123087d197"} Mar 19 09:21:37.308889 master-0 kubenswrapper[7518]: I0319 09:21:37.308812 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" event={"ID":"86c4b0e4-3481-465d-b00f-022d2c58c183","Type":"ContainerStarted","Data":"d8a756b9b58a3ce072eadde280ccd4f57de1077de86a738e2697b1425743281c"} Mar 19 09:21:38.322021 master-0 kubenswrapper[7518]: I0319 09:21:38.321939 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" event={"ID":"de509e3d-5e9c-47be-bce2-adc4f435aea8","Type":"ContainerStarted","Data":"8ea09204714320987ee497184ecc0341387802c177649f71cb1059afb0240745"} Mar 19 09:21:46.238021 master-0 kubenswrapper[7518]: E0319 09:21:46.237889 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cluster-monitoring-operator-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" podUID="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" Mar 19 09:21:46.239197 master-0 kubenswrapper[7518]: E0319 09:21:46.238005 7518 pod_workers.go:1301] "Error syncing 
pod, skipping" err="unmounted volumes=[srv-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" podUID="8aa0f17a-287e-4a19-9a59-4913e7707071" Mar 19 09:21:46.239197 master-0 kubenswrapper[7518]: E0319 09:21:46.238107 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[srv-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" podUID="208939f5-8fca-4fd5-b0c6-43484b7d1e30" Mar 19 09:21:46.239197 master-0 kubenswrapper[7518]: E0319 09:21:46.238228 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[package-server-manager-serving-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" podUID="1f2148fe-f9f6-47da-894c-b88dae360ebe" Mar 19 09:21:46.241108 master-0 kubenswrapper[7518]: E0319 09:21:46.241009 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:21:48.027731 master-0 kubenswrapper[7518]: I0319 09:21:48.027644 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_c7a807d4-04b4-40ec-b855-5aea08b58bcd/installer/0.log" Mar 19 09:21:48.027731 master-0 kubenswrapper[7518]: I0319 09:21:48.027740 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:48.168674 master-0 kubenswrapper[7518]: I0319 09:21:48.168569 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kubelet-dir\") pod \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " Mar 19 09:21:48.168917 master-0 kubenswrapper[7518]: I0319 09:21:48.168735 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kube-api-access\") pod \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " Mar 19 09:21:48.168917 master-0 kubenswrapper[7518]: I0319 09:21:48.168821 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-var-lock\") pod \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\" (UID: \"c7a807d4-04b4-40ec-b855-5aea08b58bcd\") " Mar 19 09:21:48.169240 master-0 kubenswrapper[7518]: I0319 09:21:48.169196 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-var-lock" (OuterVolumeSpecName: "var-lock") pod "c7a807d4-04b4-40ec-b855-5aea08b58bcd" (UID: "c7a807d4-04b4-40ec-b855-5aea08b58bcd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:48.169293 master-0 kubenswrapper[7518]: I0319 09:21:48.169250 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c7a807d4-04b4-40ec-b855-5aea08b58bcd" (UID: "c7a807d4-04b4-40ec-b855-5aea08b58bcd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:48.172278 master-0 kubenswrapper[7518]: I0319 09:21:48.172230 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c7a807d4-04b4-40ec-b855-5aea08b58bcd" (UID: "c7a807d4-04b4-40ec-b855-5aea08b58bcd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:48.270181 master-0 kubenswrapper[7518]: I0319 09:21:48.270022 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:48.270181 master-0 kubenswrapper[7518]: I0319 09:21:48.270077 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:48.270181 master-0 kubenswrapper[7518]: I0319 09:21:48.270090 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7a807d4-04b4-40ec-b855-5aea08b58bcd-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:48.379814 master-0 kubenswrapper[7518]: I0319 09:21:48.379725 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_c7a807d4-04b4-40ec-b855-5aea08b58bcd/installer/0.log" Mar 19 09:21:48.379814 master-0 kubenswrapper[7518]: I0319 09:21:48.379808 7518 generic.go:334] "Generic (PLEG): container finished" podID="c7a807d4-04b4-40ec-b855-5aea08b58bcd" containerID="ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3" exitCode=1 Mar 19 09:21:48.380194 master-0 kubenswrapper[7518]: I0319 09:21:48.379847 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c7a807d4-04b4-40ec-b855-5aea08b58bcd","Type":"ContainerDied","Data":"ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3"} Mar 19 09:21:48.380194 master-0 kubenswrapper[7518]: I0319 09:21:48.379887 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 09:21:48.380194 master-0 kubenswrapper[7518]: I0319 09:21:48.379926 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"c7a807d4-04b4-40ec-b855-5aea08b58bcd","Type":"ContainerDied","Data":"718e025466e104d8976f4d87e1922f752df5dac18e0a4a9bb53767720efd6215"} Mar 19 09:21:48.380194 master-0 kubenswrapper[7518]: I0319 09:21:48.379951 7518 scope.go:117] "RemoveContainer" containerID="ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3" Mar 19 09:21:48.391636 master-0 kubenswrapper[7518]: I0319 09:21:48.391603 7518 scope.go:117] "RemoveContainer" containerID="ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3" Mar 19 09:21:48.392114 master-0 kubenswrapper[7518]: E0319 09:21:48.392070 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3\": container with ID starting with ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3 not found: ID does not exist" containerID="ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3" Mar 19 09:21:48.392173 master-0 kubenswrapper[7518]: I0319 09:21:48.392114 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3"} err="failed to get container status \"ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3\": rpc error: code = 
NotFound desc = could not find container \"ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3\": container with ID starting with ed582809a03fe74506e07d144b177da5b4d6ce62a658dab52428e9eb26519ac3 not found: ID does not exist" Mar 19 09:21:49.167745 master-0 kubenswrapper[7518]: E0319 09:21:49.167680 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:21:49.168370 master-0 kubenswrapper[7518]: I0319 09:21:49.168184 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:21:49.183828 master-0 kubenswrapper[7518]: W0319 09:21:49.183767 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b4ed170d527099878cb5fdd508a2fb.slice/crio-319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210 WatchSource:0}: Error finding container 319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210: Status 404 returned error can't find the container with id 319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210 Mar 19 09:21:49.385313 master-0 kubenswrapper[7518]: I0319 09:21:49.385121 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210"} Mar 19 09:21:50.395979 master-0 kubenswrapper[7518]: I0319 09:21:50.395916 7518 generic.go:334] "Generic (PLEG): container finished" podID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerID="89a2fc8df576416ddd348c57ed4c730f0abfa16882e2a3cc4358c65c4a9606ca" exitCode=0 Mar 19 09:21:50.396517 master-0 kubenswrapper[7518]: I0319 09:21:50.396052 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" 
event={"ID":"259aa9cc-51a9-498e-b099-ba4d781801c5","Type":"ContainerDied","Data":"89a2fc8df576416ddd348c57ed4c730f0abfa16882e2a3cc4358c65c4a9606ca"} Mar 19 09:21:50.398843 master-0 kubenswrapper[7518]: I0319 09:21:50.398780 7518 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391" exitCode=0 Mar 19 09:21:50.398912 master-0 kubenswrapper[7518]: I0319 09:21:50.398855 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391"} Mar 19 09:21:51.407913 master-0 kubenswrapper[7518]: I0319 09:21:51.407813 7518 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="f432083e0bbefbf0b796c955a8b8a3248de20b6a5a5f87ee1ff2f03234e367ae" exitCode=1 Mar 19 09:21:51.408723 master-0 kubenswrapper[7518]: I0319 09:21:51.408071 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"f432083e0bbefbf0b796c955a8b8a3248de20b6a5a5f87ee1ff2f03234e367ae"} Mar 19 09:21:51.408723 master-0 kubenswrapper[7518]: I0319 09:21:51.408127 7518 scope.go:117] "RemoveContainer" containerID="45451316b1a7ec438f9d41dbda0f8c815892268dced06b008a0fa9fc13645266" Mar 19 09:21:51.408819 master-0 kubenswrapper[7518]: I0319 09:21:51.408737 7518 scope.go:117] "RemoveContainer" containerID="f432083e0bbefbf0b796c955a8b8a3248de20b6a5a5f87ee1ff2f03234e367ae" Mar 19 09:21:51.419462 master-0 kubenswrapper[7518]: I0319 09:21:51.419371 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:21:51.419638 master-0 kubenswrapper[7518]: I0319 09:21:51.419506 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:21:51.419638 master-0 kubenswrapper[7518]: I0319 09:21:51.419563 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:21:51.419638 master-0 kubenswrapper[7518]: I0319 09:21:51.419602 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:21:51.419865 master-0 kubenswrapper[7518]: I0319 09:21:51.419668 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 
19 09:21:51.424898 master-0 kubenswrapper[7518]: I0319 09:21:51.424825 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:21:51.426181 master-0 kubenswrapper[7518]: I0319 09:21:51.426127 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:21:51.426763 master-0 kubenswrapper[7518]: I0319 09:21:51.426645 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:21:51.426958 master-0 kubenswrapper[7518]: I0319 09:21:51.426903 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:21:51.429996 master-0 kubenswrapper[7518]: I0319 09:21:51.429960 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") 
pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:21:51.687400 master-0 kubenswrapper[7518]: I0319 09:21:51.687326 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:51.823924 master-0 kubenswrapper[7518]: I0319 09:21:51.823837 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-kubelet-dir\") pod \"259aa9cc-51a9-498e-b099-ba4d781801c5\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " Mar 19 09:21:51.824155 master-0 kubenswrapper[7518]: I0319 09:21:51.823991 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-var-lock\") pod \"259aa9cc-51a9-498e-b099-ba4d781801c5\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " Mar 19 09:21:51.824155 master-0 kubenswrapper[7518]: I0319 09:21:51.824026 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "259aa9cc-51a9-498e-b099-ba4d781801c5" (UID: "259aa9cc-51a9-498e-b099-ba4d781801c5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:51.824155 master-0 kubenswrapper[7518]: I0319 09:21:51.824037 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/259aa9cc-51a9-498e-b099-ba4d781801c5-kube-api-access\") pod \"259aa9cc-51a9-498e-b099-ba4d781801c5\" (UID: \"259aa9cc-51a9-498e-b099-ba4d781801c5\") " Mar 19 09:21:51.824390 master-0 kubenswrapper[7518]: I0319 09:21:51.824316 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:51.824390 master-0 kubenswrapper[7518]: I0319 09:21:51.824323 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-var-lock" (OuterVolumeSpecName: "var-lock") pod "259aa9cc-51a9-498e-b099-ba4d781801c5" (UID: "259aa9cc-51a9-498e-b099-ba4d781801c5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:21:51.827705 master-0 kubenswrapper[7518]: I0319 09:21:51.827669 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259aa9cc-51a9-498e-b099-ba4d781801c5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "259aa9cc-51a9-498e-b099-ba4d781801c5" (UID: "259aa9cc-51a9-498e-b099-ba4d781801c5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:21:51.926019 master-0 kubenswrapper[7518]: I0319 09:21:51.925801 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/259aa9cc-51a9-498e-b099-ba4d781801c5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:51.926019 master-0 kubenswrapper[7518]: I0319 09:21:51.925867 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/259aa9cc-51a9-498e-b099-ba4d781801c5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:21:52.414542 master-0 kubenswrapper[7518]: I0319 09:21:52.414455 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"259aa9cc-51a9-498e-b099-ba4d781801c5","Type":"ContainerDied","Data":"e5c5cd2c130a06e83a755f581cc3a20c2c3dce618468e51c158559ad4071da8b"} Mar 19 09:21:52.415237 master-0 kubenswrapper[7518]: I0319 09:21:52.414547 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c5cd2c130a06e83a755f581cc3a20c2c3dce618468e51c158559ad4071da8b" Mar 19 09:21:52.415237 master-0 kubenswrapper[7518]: I0319 09:21:52.414497 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 09:21:52.417367 master-0 kubenswrapper[7518]: I0319 09:21:52.417306 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8"} Mar 19 09:21:53.424699 master-0 kubenswrapper[7518]: I0319 09:21:53.424611 7518 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d" exitCode=1 Mar 19 09:21:53.424699 master-0 kubenswrapper[7518]: I0319 09:21:53.424692 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d"} Mar 19 09:21:53.425446 master-0 kubenswrapper[7518]: I0319 09:21:53.425377 7518 scope.go:117] "RemoveContainer" containerID="fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d" Mar 19 09:21:54.094019 master-0 kubenswrapper[7518]: E0319 09:21:54.093862 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:54.432851 master-0 kubenswrapper[7518]: I0319 09:21:54.432779 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"6081e5f52de3fc4dc3f746460dde01bf5beff21d46d2be6b213ee24cc51a7282"} Mar 19 09:21:55.424296 master-0 kubenswrapper[7518]: E0319 09:21:55.424080 7518 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:45Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:45Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:45Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:21:45Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":
683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de
8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\\\"],\\\"sizeBytes\\\":470681292},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79
c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:21:57.315207 master-0 kubenswrapper[7518]: I0319 09:21:57.315085 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:21:57.316836 master-0 kubenswrapper[7518]: I0319 09:21:57.316015 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:21:58.324713 master-0 kubenswrapper[7518]: I0319 09:21:58.324562 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:21:59.019286 master-0 kubenswrapper[7518]: I0319 09:21:59.019207 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:21:59.315537 master-0 kubenswrapper[7518]: I0319 09:21:59.315379 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:21:59.315724 master-0 kubenswrapper[7518]: I0319 09:21:59.315648 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:21:59.316352 master-0 kubenswrapper[7518]: I0319 09:21:59.316335 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:21:59.316516 master-0 kubenswrapper[7518]: I0319 09:21:59.316391 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:22:00.315351 master-0 kubenswrapper[7518]: I0319 09:22:00.315284 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:22:00.316004 master-0 kubenswrapper[7518]: I0319 09:22:00.315916 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:22:00.316175 master-0 kubenswrapper[7518]: I0319 09:22:00.316136 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:22:00.317211 master-0 kubenswrapper[7518]: I0319 09:22:00.317194 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:22:02.020098 master-0 kubenswrapper[7518]: I0319 09:22:02.020012 7518 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:02.119346 master-0 kubenswrapper[7518]: I0319 09:22:02.119260 7518 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-ct498 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" start-of-body= Mar 19 09:22:02.119760 master-0 kubenswrapper[7518]: I0319 09:22:02.119404 7518 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" podUID="9663cc40-a69d-42ba-890e-071cb85062f5" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" Mar 19 09:22:03.406179 master-0 kubenswrapper[7518]: E0319 09:22:03.406058 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:03.487317 master-0 kubenswrapper[7518]: I0319 09:22:03.487206 7518 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a" exitCode=0 Mar 19 09:22:04.096222 master-0 kubenswrapper[7518]: E0319 09:22:04.096082 7518 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:04.496152 master-0 kubenswrapper[7518]: I0319 09:22:04.496056 7518 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f" exitCode=0 Mar 19 09:22:04.496152 master-0 kubenswrapper[7518]: I0319 09:22:04.496107 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f"} Mar 19 09:22:05.425504 master-0 kubenswrapper[7518]: E0319 09:22:05.425388 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:06.253732 master-0 kubenswrapper[7518]: I0319 09:22:06.253648 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 19 09:22:06.254580 master-0 kubenswrapper[7518]: I0319 09:22:06.253793 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:22:06.325956 master-0 kubenswrapper[7518]: I0319 09:22:06.325860 7518 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 09:22:06.424883 master-0 kubenswrapper[7518]: I0319 09:22:06.424590 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 19 09:22:06.424883 master-0 kubenswrapper[7518]: I0319 09:22:06.424738 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs" (OuterVolumeSpecName: "certs") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:06.424883 master-0 kubenswrapper[7518]: I0319 09:22:06.424810 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 19 09:22:06.424883 master-0 kubenswrapper[7518]: I0319 09:22:06.424843 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir" (OuterVolumeSpecName: "data-dir") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:06.425658 master-0 kubenswrapper[7518]: I0319 09:22:06.425255 7518 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:06.425658 master-0 kubenswrapper[7518]: I0319 09:22:06.425289 7518 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:06.518194 master-0 kubenswrapper[7518]: I0319 09:22:06.518132 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 19 09:22:06.518547 master-0 kubenswrapper[7518]: I0319 09:22:06.518233 7518 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b" exitCode=137 Mar 19 09:22:06.518547 master-0 kubenswrapper[7518]: I0319 09:22:06.518403 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:22:08.531860 master-0 kubenswrapper[7518]: I0319 09:22:08.531758 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_0df23b55-3dea-4f5e-9d53-5c7755ea4e48/installer/0.log" Mar 19 09:22:08.531860 master-0 kubenswrapper[7518]: I0319 09:22:08.531847 7518 generic.go:334] "Generic (PLEG): container finished" podID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerID="cbac5fecef5ccbfed911c8dc762330e4e21b1d157632cde1feee52ece3850c21" exitCode=1 Mar 19 09:22:10.137197 master-0 kubenswrapper[7518]: E0319 09:22:10.136973 7518 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e33a71f3c5499 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:21:36.107156633 +0000 UTC m=+113.989739892,LastTimestamp:2026-03-19 09:21:36.107156633 +0000 UTC m=+113.989739892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:22:10.549925 master-0 kubenswrapper[7518]: I0319 09:22:10.548953 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_2de53594-9dcc-4318-806a-64f39ef76b3b/installer/0.log" Mar 19 09:22:10.549925 master-0 kubenswrapper[7518]: I0319 09:22:10.549027 7518 generic.go:334] "Generic (PLEG): container finished" podID="2de53594-9dcc-4318-806a-64f39ef76b3b" 
containerID="921b32f57f187453279e5e8112c07cdf7b75d2182a8ace33d227749c1f7857e9" exitCode=1 Mar 19 09:22:12.019045 master-0 kubenswrapper[7518]: I0319 09:22:12.018961 7518 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:13.567869 master-0 kubenswrapper[7518]: I0319 09:22:13.567803 7518 generic.go:334] "Generic (PLEG): container finished" podID="a823c8bc-09ef-46a9-a1f3-155a34b89788" containerID="fe703627bf17490741c98c350c37ad5f26868d707caaf28e298dbcd09ba6eb50" exitCode=0 Mar 19 09:22:14.097680 master-0 kubenswrapper[7518]: E0319 09:22:14.097522 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:15.426256 master-0 kubenswrapper[7518]: E0319 09:22:15.426185 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:16.588740 master-0 kubenswrapper[7518]: I0319 09:22:16.588667 7518 generic.go:334] "Generic (PLEG): container finished" podID="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" containerID="b4cd172092883e2c59c413605caa9eda30c5b4011ddd9168033acc5dfa87297f" exitCode=0 Mar 19 09:22:17.505860 master-0 kubenswrapper[7518]: E0319 09:22:17.505745 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" 
pod="openshift-etcd/etcd-master-0" Mar 19 09:22:18.601992 master-0 kubenswrapper[7518]: I0319 09:22:18.601912 7518 generic.go:334] "Generic (PLEG): container finished" podID="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" containerID="be05318150c766720e5d230c0bf2401720113751ff91aa74d2d72ed4d56c5f47" exitCode=0 Mar 19 09:22:18.604299 master-0 kubenswrapper[7518]: I0319 09:22:18.604259 7518 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9" exitCode=0 Mar 19 09:22:22.019543 master-0 kubenswrapper[7518]: I0319 09:22:22.019433 7518 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:23.637085 master-0 kubenswrapper[7518]: I0319 09:22:23.637019 7518 generic.go:334] "Generic (PLEG): container finished" podID="5b36f3b2-caf9-40ad-a3a1-e83796142f54" containerID="a9e3c64428edfb89f548d2d0f11b93a4546a142c8d9ea26eed5c6670f21e1d16" exitCode=0 Mar 19 09:22:24.098605 master-0 kubenswrapper[7518]: E0319 09:22:24.098250 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:25.426706 master-0 kubenswrapper[7518]: E0319 09:22:25.426619 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:26.658242 
master-0 kubenswrapper[7518]: I0319 09:22:26.658166 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kqb2h_b2898746-6827-41d9-ac88-64206cb84ac9/approver/0.log" Mar 19 09:22:26.659035 master-0 kubenswrapper[7518]: I0319 09:22:26.658862 7518 generic.go:334] "Generic (PLEG): container finished" podID="b2898746-6827-41d9-ac88-64206cb84ac9" containerID="5f66b7b4498be8ffcef1be07d5415ae49ca99cf0c15b74518d97c2537613d5cc" exitCode=1 Mar 19 09:22:28.684199 master-0 kubenswrapper[7518]: I0319 09:22:28.684095 7518 generic.go:334] "Generic (PLEG): container finished" podID="9663cc40-a69d-42ba-890e-071cb85062f5" containerID="cdf18d2610050197f807cf4a5fc0308ba6a5aa77b434d76558194e6bb3ba81d0" exitCode=0 Mar 19 09:22:31.610404 master-0 kubenswrapper[7518]: E0319 09:22:31.610132 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 09:22:34.099900 master-0 kubenswrapper[7518]: E0319 09:22:34.099740 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:34.099900 master-0 kubenswrapper[7518]: I0319 09:22:34.099844 7518 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:22:35.427591 master-0 kubenswrapper[7518]: E0319 09:22:35.427388 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:35.427591 master-0 kubenswrapper[7518]: E0319 09:22:35.427439 
7518 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:22:36.280124 master-0 kubenswrapper[7518]: I0319 09:22:36.280011 7518 status_manager.go:851] "Failed to get status for pod" podUID="86c4b0e4-3481-465d-b00f-022d2c58c183" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods openshift-apiserver-operator-d65958b8-96qpx)" Mar 19 09:22:40.330096 master-0 kubenswrapper[7518]: E0319 09:22:40.330020 7518 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 09:22:40.330741 master-0 kubenswrapper[7518]: E0319 09:22:40.330530 7518 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Mar 19 09:22:40.331376 master-0 kubenswrapper[7518]: I0319 09:22:40.331328 7518 scope.go:117] "RemoveContainer" containerID="5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a" Mar 19 09:22:40.331461 master-0 kubenswrapper[7518]: I0319 09:22:40.331183 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"0df23b55-3dea-4f5e-9d53-5c7755ea4e48","Type":"ContainerDied","Data":"cbac5fecef5ccbfed911c8dc762330e4e21b1d157632cde1feee52ece3850c21"} Mar 19 09:22:40.331897 master-0 kubenswrapper[7518]: I0319 09:22:40.331556 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:22:40.331897 master-0 kubenswrapper[7518]: I0319 09:22:40.331585 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" 
event={"ID":"2de53594-9dcc-4318-806a-64f39ef76b3b","Type":"ContainerDied","Data":"921b32f57f187453279e5e8112c07cdf7b75d2182a8ace33d227749c1f7857e9"} Mar 19 09:22:40.331897 master-0 kubenswrapper[7518]: I0319 09:22:40.331614 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" event={"ID":"a823c8bc-09ef-46a9-a1f3-155a34b89788","Type":"ContainerDied","Data":"fe703627bf17490741c98c350c37ad5f26868d707caaf28e298dbcd09ba6eb50"} Mar 19 09:22:40.331897 master-0 kubenswrapper[7518]: I0319 09:22:40.331633 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerDied","Data":"b4cd172092883e2c59c413605caa9eda30c5b4011ddd9168033acc5dfa87297f"} Mar 19 09:22:40.332196 master-0 kubenswrapper[7518]: I0319 09:22:40.332137 7518 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 19 09:22:40.332276 master-0 kubenswrapper[7518]: I0319 09:22:40.332216 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8" gracePeriod=30 Mar 19 09:22:40.332443 master-0 kubenswrapper[7518]: I0319 09:22:40.332369 7518 scope.go:117] "RemoveContainer" containerID="b4cd172092883e2c59c413605caa9eda30c5b4011ddd9168033acc5dfa87297f" Mar 19 09:22:40.335699 master-0 kubenswrapper[7518]: I0319 09:22:40.335623 7518 scope.go:117] 
"RemoveContainer" containerID="5f66b7b4498be8ffcef1be07d5415ae49ca99cf0c15b74518d97c2537613d5cc" Mar 19 09:22:40.338308 master-0 kubenswrapper[7518]: I0319 09:22:40.338269 7518 scope.go:117] "RemoveContainer" containerID="fe703627bf17490741c98c350c37ad5f26868d707caaf28e298dbcd09ba6eb50" Mar 19 09:22:40.339688 master-0 kubenswrapper[7518]: I0319 09:22:40.339617 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d664a6d0d2a24360dee10612610f1b59" path="/var/lib/kubelet/pods/d664a6d0d2a24360dee10612610f1b59/volumes" Mar 19 09:22:40.340401 master-0 kubenswrapper[7518]: I0319 09:22:40.340030 7518 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 09:22:40.357255 master-0 kubenswrapper[7518]: I0319 09:22:40.357224 7518 scope.go:117] "RemoveContainer" containerID="24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b" Mar 19 09:22:40.391556 master-0 kubenswrapper[7518]: I0319 09:22:40.391449 7518 scope.go:117] "RemoveContainer" containerID="5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a" Mar 19 09:22:40.392605 master-0 kubenswrapper[7518]: E0319 09:22:40.392452 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a\": container with ID starting with 5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a not found: ID does not exist" containerID="5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a" Mar 19 09:22:40.392605 master-0 kubenswrapper[7518]: I0319 09:22:40.392534 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a"} err="failed to get container status \"5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a\": rpc error: code = NotFound desc = could not find container 
\"5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a\": container with ID starting with 5aa3e736d36d7b5fc8fb93a72368cce6c129176806809c85a301748bc0aca23a not found: ID does not exist" Mar 19 09:22:40.392605 master-0 kubenswrapper[7518]: I0319 09:22:40.392568 7518 scope.go:117] "RemoveContainer" containerID="24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b" Mar 19 09:22:40.393634 master-0 kubenswrapper[7518]: E0319 09:22:40.393593 7518 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b\": container with ID starting with 24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b not found: ID does not exist" containerID="24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b" Mar 19 09:22:40.393853 master-0 kubenswrapper[7518]: I0319 09:22:40.393635 7518 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b"} err="failed to get container status \"24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b\": rpc error: code = NotFound desc = could not find container \"24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b\": container with ID starting with 24b2c17065d3c67eeab4e5a8a59d3f739b386413374d44a782d1ecd034cf1a1b not found: ID does not exist" Mar 19 09:22:40.767604 master-0 kubenswrapper[7518]: I0319 09:22:40.767431 7518 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8" exitCode=2 Mar 19 09:22:40.775645 master-0 kubenswrapper[7518]: I0319 09:22:40.775593 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kqb2h_b2898746-6827-41d9-ac88-64206cb84ac9/approver/0.log" Mar 19 
09:22:41.029540 master-0 kubenswrapper[7518]: I0319 09:22:41.024695 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_2de53594-9dcc-4318-806a-64f39ef76b3b/installer/0.log" Mar 19 09:22:41.029540 master-0 kubenswrapper[7518]: I0319 09:22:41.024772 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:22:41.147037 master-0 kubenswrapper[7518]: I0319 09:22:41.146967 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_0df23b55-3dea-4f5e-9d53-5c7755ea4e48/installer/0.log" Mar 19 09:22:41.147379 master-0 kubenswrapper[7518]: I0319 09:22:41.147067 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:22:41.188239 master-0 kubenswrapper[7518]: I0319 09:22:41.188135 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-var-lock\") pod \"2de53594-9dcc-4318-806a-64f39ef76b3b\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " Mar 19 09:22:41.188239 master-0 kubenswrapper[7518]: I0319 09:22:41.188263 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-kubelet-dir\") pod \"2de53594-9dcc-4318-806a-64f39ef76b3b\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " Mar 19 09:22:41.188712 master-0 kubenswrapper[7518]: I0319 09:22:41.188293 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-var-lock" (OuterVolumeSpecName: "var-lock") pod "2de53594-9dcc-4318-806a-64f39ef76b3b" (UID: "2de53594-9dcc-4318-806a-64f39ef76b3b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:41.188712 master-0 kubenswrapper[7518]: I0319 09:22:41.188333 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2de53594-9dcc-4318-806a-64f39ef76b3b" (UID: "2de53594-9dcc-4318-806a-64f39ef76b3b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:41.188712 master-0 kubenswrapper[7518]: I0319 09:22:41.188349 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de53594-9dcc-4318-806a-64f39ef76b3b-kube-api-access\") pod \"2de53594-9dcc-4318-806a-64f39ef76b3b\" (UID: \"2de53594-9dcc-4318-806a-64f39ef76b3b\") " Mar 19 09:22:41.188712 master-0 kubenswrapper[7518]: I0319 09:22:41.188660 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:41.188712 master-0 kubenswrapper[7518]: I0319 09:22:41.188677 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2de53594-9dcc-4318-806a-64f39ef76b3b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:41.191960 master-0 kubenswrapper[7518]: I0319 09:22:41.191870 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de53594-9dcc-4318-806a-64f39ef76b3b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2de53594-9dcc-4318-806a-64f39ef76b3b" (UID: "2de53594-9dcc-4318-806a-64f39ef76b3b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:22:41.289636 master-0 kubenswrapper[7518]: I0319 09:22:41.289419 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-var-lock\") pod \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " Mar 19 09:22:41.289636 master-0 kubenswrapper[7518]: I0319 09:22:41.289577 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-var-lock" (OuterVolumeSpecName: "var-lock") pod "0df23b55-3dea-4f5e-9d53-5c7755ea4e48" (UID: "0df23b55-3dea-4f5e-9d53-5c7755ea4e48"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:41.289985 master-0 kubenswrapper[7518]: I0319 09:22:41.289673 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kubelet-dir\") pod \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " Mar 19 09:22:41.289985 master-0 kubenswrapper[7518]: I0319 09:22:41.289750 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kube-api-access\") pod \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\" (UID: \"0df23b55-3dea-4f5e-9d53-5c7755ea4e48\") " Mar 19 09:22:41.289985 master-0 kubenswrapper[7518]: I0319 09:22:41.289782 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0df23b55-3dea-4f5e-9d53-5c7755ea4e48" (UID: "0df23b55-3dea-4f5e-9d53-5c7755ea4e48"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:22:41.289985 master-0 kubenswrapper[7518]: I0319 09:22:41.289973 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:41.289985 master-0 kubenswrapper[7518]: I0319 09:22:41.289989 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de53594-9dcc-4318-806a-64f39ef76b3b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:41.290135 master-0 kubenswrapper[7518]: I0319 09:22:41.290001 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:41.293277 master-0 kubenswrapper[7518]: I0319 09:22:41.293230 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0df23b55-3dea-4f5e-9d53-5c7755ea4e48" (UID: "0df23b55-3dea-4f5e-9d53-5c7755ea4e48"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:22:41.391464 master-0 kubenswrapper[7518]: I0319 09:22:41.391384 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0df23b55-3dea-4f5e-9d53-5c7755ea4e48-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:22:41.787176 master-0 kubenswrapper[7518]: I0319 09:22:41.787034 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_0df23b55-3dea-4f5e-9d53-5c7755ea4e48/installer/0.log" Mar 19 09:22:41.787619 master-0 kubenswrapper[7518]: I0319 09:22:41.787279 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 09:22:41.805177 master-0 kubenswrapper[7518]: I0319 09:22:41.805121 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_2de53594-9dcc-4318-806a-64f39ef76b3b/installer/0.log" Mar 19 09:22:41.805436 master-0 kubenswrapper[7518]: I0319 09:22:41.805262 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 09:22:44.100937 master-0 kubenswrapper[7518]: E0319 09:22:44.100771 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 19 09:22:44.140997 master-0 kubenswrapper[7518]: E0319 09:22:44.140738 7518 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw.189e33a75a199783 openshift-kube-storage-version-migrator-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-storage-version-migrator-operator,Name:kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw,UID:a75049de-dcf1-4102-b339-f45d5015adea,APIVersion:v1,ResourceVersion:3673,FieldPath:spec.containers{kube-storage-version-migrator-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:21:37.094735747 +0000 UTC m=+114.977319006,LastTimestamp:2026-03-19 09:21:37.094735747 +0000 UTC m=+114.977319006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:22:51.872040 master-0 kubenswrapper[7518]: I0319 09:22:51.871736 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_dc248e59-1519-4ac3-9005-2239214a8d62/installer/0.log" Mar 19 09:22:51.872040 master-0 
kubenswrapper[7518]: I0319 09:22:51.871863 7518 generic.go:334] "Generic (PLEG): container finished" podID="dc248e59-1519-4ac3-9005-2239214a8d62" containerID="2b23049d85d383fc87e2217ac4c88730e6accf178c37b42720c1211cad94765e" exitCode=1 Mar 19 09:22:54.303069 master-0 kubenswrapper[7518]: E0319 09:22:54.302974 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 19 09:22:55.512113 master-0 kubenswrapper[7518]: E0319 09:22:55.511781 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:22:45Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:22:45Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:22:45Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:22:45Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f0
3b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d2
6304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:908eaaf624959bc7645f6d585d160431d1efb070e9a1f37fefed73a3be42b0d3\\\"],\\\"sizeBytes\\\":470681292},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cdd28dfe7132e19af9f013f72cf120d970bc31b6b74693af262f8d2e82a096e1\\\"],\\\"sizeBytes\\\":467235741},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:57.978809 master-0 kubenswrapper[7518]: E0319 09:22:57.978686 7518 log.go:32] "RunPodSandbox from runtime service 
failed" err=< Mar 19 09:22:57.978809 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_olm-operator-5c9796789-wjbt2_openshift-operator-lifecycle-manager_8aa0f17a-287e-4a19-9a59-4913e7707071_0(3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c): error adding pod openshift-operator-lifecycle-manager_olm-operator-5c9796789-wjbt2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c" Netns:"/var/run/netns/e08277fa-1304-4e4f-99c1-ee3ce5f3905f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=olm-operator-5c9796789-wjbt2;K8S_POD_INFRA_CONTAINER_ID=3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c;K8S_POD_UID=8aa0f17a-287e-4a19-9a59-4913e7707071" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2] networking: Multus: [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2/8aa0f17a-287e-4a19-9a59-4913e7707071]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: SetNetworkStatus: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/olm-operator-5c9796789-wjbt2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:22:57.978809 master-0 kubenswrapper[7518]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:57.978809 master-0 kubenswrapper[7518]: > Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: E0319 09:22:57.978893 7518 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_olm-operator-5c9796789-wjbt2_openshift-operator-lifecycle-manager_8aa0f17a-287e-4a19-9a59-4913e7707071_0(3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c): error adding pod openshift-operator-lifecycle-manager_olm-operator-5c9796789-wjbt2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c" Netns:"/var/run/netns/e08277fa-1304-4e4f-99c1-ee3ce5f3905f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=olm-operator-5c9796789-wjbt2;K8S_POD_INFRA_CONTAINER_ID=3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c;K8S_POD_UID=8aa0f17a-287e-4a19-9a59-4913e7707071" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2] networking: Multus: [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2/8aa0f17a-287e-4a19-9a59-4913e7707071]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: SetNetworkStatus: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: 
status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/olm-operator-5c9796789-wjbt2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: > pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: E0319 09:22:57.978929 7518 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_olm-operator-5c9796789-wjbt2_openshift-operator-lifecycle-manager_8aa0f17a-287e-4a19-9a59-4913e7707071_0(3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c): error adding pod openshift-operator-lifecycle-manager_olm-operator-5c9796789-wjbt2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c" Netns:"/var/run/netns/e08277fa-1304-4e4f-99c1-ee3ce5f3905f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=olm-operator-5c9796789-wjbt2;K8S_POD_INFRA_CONTAINER_ID=3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c;K8S_POD_UID=8aa0f17a-287e-4a19-9a59-4913e7707071" Path:"" ERRORED: error configuring pod 
[openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2] networking: Multus: [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2/8aa0f17a-287e-4a19-9a59-4913e7707071]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: SetNetworkStatus: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/olm-operator-5c9796789-wjbt2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: > pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:22:57.979391 master-0 kubenswrapper[7518]: E0319 09:22:57.979071 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"olm-operator-5c9796789-wjbt2_openshift-operator-lifecycle-manager(8aa0f17a-287e-4a19-9a59-4913e7707071)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"olm-operator-5c9796789-wjbt2_openshift-operator-lifecycle-manager(8aa0f17a-287e-4a19-9a59-4913e7707071)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_olm-operator-5c9796789-wjbt2_openshift-operator-lifecycle-manager_8aa0f17a-287e-4a19-9a59-4913e7707071_0(3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c): error adding pod 
openshift-operator-lifecycle-manager_olm-operator-5c9796789-wjbt2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c\\\" Netns:\\\"/var/run/netns/e08277fa-1304-4e4f-99c1-ee3ce5f3905f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=olm-operator-5c9796789-wjbt2;K8S_POD_INFRA_CONTAINER_ID=3557a1b45dd90816953dc552eea9a193dd5b6c16976411f913644e5838d29b1c;K8S_POD_UID=8aa0f17a-287e-4a19-9a59-4913e7707071\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2] networking: Multus: [openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2/8aa0f17a-287e-4a19-9a59-4913e7707071]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: SetNetworkStatus: failed to update the pod olm-operator-5c9796789-wjbt2 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/olm-operator-5c9796789-wjbt2?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" 
podUID="8aa0f17a-287e-4a19-9a59-4913e7707071" Mar 19 09:23:00.022066 master-0 kubenswrapper[7518]: E0319 09:23:00.021998 7518 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:23:00.022066 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_catalog-operator-68f85b4d6c-j92kd_openshift-operator-lifecycle-manager_208939f5-8fca-4fd5-b0c6-43484b7d1e30_0(5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377): error adding pod openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-j92kd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377" Netns:"/var/run/netns/970b68fd-eea9-46ff-b513-0e04d17e2d4f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=catalog-operator-68f85b4d6c-j92kd;K8S_POD_INFRA_CONTAINER_ID=5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377;K8S_POD_UID=208939f5-8fca-4fd5-b0c6-43484b7d1e30" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd] networking: Multus: [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd/208939f5-8fca-4fd5-b0c6-43484b7d1e30]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: SetNetworkStatus: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/catalog-operator-68f85b4d6c-j92kd?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.022066 master-0 kubenswrapper[7518]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.022066 master-0 kubenswrapper[7518]: > Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: E0319 09:23:00.022122 7518 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_catalog-operator-68f85b4d6c-j92kd_openshift-operator-lifecycle-manager_208939f5-8fca-4fd5-b0c6-43484b7d1e30_0(5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377): error adding pod openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-j92kd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377" Netns:"/var/run/netns/970b68fd-eea9-46ff-b513-0e04d17e2d4f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=catalog-operator-68f85b4d6c-j92kd;K8S_POD_INFRA_CONTAINER_ID=5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377;K8S_POD_UID=208939f5-8fca-4fd5-b0c6-43484b7d1e30" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd] networking: Multus: [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd/208939f5-8fca-4fd5-b0c6-43484b7d1e30]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: SetNetworkStatus: failed to update the pod 
catalog-operator-68f85b4d6c-j92kd in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/catalog-operator-68f85b4d6c-j92kd?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: > pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: E0319 09:23:00.022149 7518 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_catalog-operator-68f85b4d6c-j92kd_openshift-operator-lifecycle-manager_208939f5-8fca-4fd5-b0c6-43484b7d1e30_0(5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377): error adding pod openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-j92kd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377" Netns:"/var/run/netns/970b68fd-eea9-46ff-b513-0e04d17e2d4f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=catalog-operator-68f85b4d6c-j92kd;K8S_POD_INFRA_CONTAINER_ID=5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377;K8S_POD_UID=208939f5-8fca-4fd5-b0c6-43484b7d1e30" 
Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd] networking: Multus: [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd/208939f5-8fca-4fd5-b0c6-43484b7d1e30]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: SetNetworkStatus: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/catalog-operator-68f85b4d6c-j92kd?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: > pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:23:00.022598 master-0 kubenswrapper[7518]: E0319 09:23:00.022235 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"catalog-operator-68f85b4d6c-j92kd_openshift-operator-lifecycle-manager(208939f5-8fca-4fd5-b0c6-43484b7d1e30)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"catalog-operator-68f85b4d6c-j92kd_openshift-operator-lifecycle-manager(208939f5-8fca-4fd5-b0c6-43484b7d1e30)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_catalog-operator-68f85b4d6c-j92kd_openshift-operator-lifecycle-manager_208939f5-8fca-4fd5-b0c6-43484b7d1e30_0(5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377): error adding pod openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-j92kd to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377\\\" Netns:\\\"/var/run/netns/970b68fd-eea9-46ff-b513-0e04d17e2d4f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=catalog-operator-68f85b4d6c-j92kd;K8S_POD_INFRA_CONTAINER_ID=5e6530db9d30bd22d87f35871d24fe1e9b352cc3dd09cce4a1e86c5991a24377;K8S_POD_UID=208939f5-8fca-4fd5-b0c6-43484b7d1e30\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd] networking: Multus: [openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd/208939f5-8fca-4fd5-b0c6-43484b7d1e30]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: SetNetworkStatus: failed to update the pod catalog-operator-68f85b4d6c-j92kd in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/catalog-operator-68f85b4d6c-j92kd?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" podUID="208939f5-8fca-4fd5-b0c6-43484b7d1e30" Mar 19 09:23:00.039806 master-0 kubenswrapper[7518]: E0319 09:23:00.039742 7518 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:23:00.039806 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_package-server-manager-7b95f86987-gltb5_openshift-operator-lifecycle-manager_1f2148fe-f9f6-47da-894c-b88dae360ebe_0(2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a): error adding pod openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-gltb5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a" Netns:"/var/run/netns/712c2c85-ce5d-4530-8545-b2e9c4a3e393" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=package-server-manager-7b95f86987-gltb5;K8S_POD_INFRA_CONTAINER_ID=2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a;K8S_POD_UID=1f2148fe-f9f6-47da-894c-b88dae360ebe" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5] networking: Multus: [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5/1f2148fe-f9f6-47da-894c-b88dae360ebe]: error setting the 
networks status: SetPodNetworkStatusAnnotation: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: SetNetworkStatus: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/package-server-manager-7b95f86987-gltb5?timeout=1m0s": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.039806 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.039806 master-0 kubenswrapper[7518]: > Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: E0319 09:23:00.039843 7518 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_package-server-manager-7b95f86987-gltb5_openshift-operator-lifecycle-manager_1f2148fe-f9f6-47da-894c-b88dae360ebe_0(2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a): error adding pod openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-gltb5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a" Netns:"/var/run/netns/712c2c85-ce5d-4530-8545-b2e9c4a3e393" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=package-server-manager-7b95f86987-gltb5;K8S_POD_INFRA_CONTAINER_ID=2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a;K8S_POD_UID=1f2148fe-f9f6-47da-894c-b88dae360ebe" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5] networking: Multus: [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5/1f2148fe-f9f6-47da-894c-b88dae360ebe]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: SetNetworkStatus: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/package-server-manager-7b95f86987-gltb5?timeout=1m0s": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: > pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: E0319 09:23:00.039864 7518 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_package-server-manager-7b95f86987-gltb5_openshift-operator-lifecycle-manager_1f2148fe-f9f6-47da-894c-b88dae360ebe_0(2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a): error adding pod openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-gltb5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a" Netns:"/var/run/netns/712c2c85-ce5d-4530-8545-b2e9c4a3e393" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=package-server-manager-7b95f86987-gltb5;K8S_POD_INFRA_CONTAINER_ID=2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a;K8S_POD_UID=1f2148fe-f9f6-47da-894c-b88dae360ebe" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5] networking: Multus: [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5/1f2148fe-f9f6-47da-894c-b88dae360ebe]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: SetNetworkStatus: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/package-server-manager-7b95f86987-gltb5?timeout=1m0s": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: > pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:23:00.040108 master-0 kubenswrapper[7518]: E0319 09:23:00.039942 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"package-server-manager-7b95f86987-gltb5_openshift-operator-lifecycle-manager(1f2148fe-f9f6-47da-894c-b88dae360ebe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"package-server-manager-7b95f86987-gltb5_openshift-operator-lifecycle-manager(1f2148fe-f9f6-47da-894c-b88dae360ebe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_package-server-manager-7b95f86987-gltb5_openshift-operator-lifecycle-manager_1f2148fe-f9f6-47da-894c-b88dae360ebe_0(2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a): error adding pod openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-gltb5 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a\\\" Netns:\\\"/var/run/netns/712c2c85-ce5d-4530-8545-b2e9c4a3e393\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=package-server-manager-7b95f86987-gltb5;K8S_POD_INFRA_CONTAINER_ID=2112b2c9adb2d0fb0ed222edddd7adc437cbd771b174dd85e078521ab58ddc3a;K8S_POD_UID=1f2148fe-f9f6-47da-894c-b88dae360ebe\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5] networking: Multus: [openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5/1f2148fe-f9f6-47da-894c-b88dae360ebe]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: SetNetworkStatus: failed to update the pod package-server-manager-7b95f86987-gltb5 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/package-server-manager-7b95f86987-gltb5?timeout=1m0s\\\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" podUID="1f2148fe-f9f6-47da-894c-b88dae360ebe" Mar 19 09:23:00.887756 master-0 kubenswrapper[7518]: E0319 09:23:00.887697 7518 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:23:00.887756 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-metrics-daemon-p76jz_openshift-multus_4256d841-23cb-4756-b827-f44ee6e54def_0(af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7): error adding pod openshift-multus_network-metrics-daemon-p76jz to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with 
status 400: 'ContainerID:"af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7" Netns:"/var/run/netns/c0771a82-ab35-44c9-a61c-59fe770379db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-multus;K8S_POD_NAME=network-metrics-daemon-p76jz;K8S_POD_INFRA_CONTAINER_ID=af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7;K8S_POD_UID=4256d841-23cb-4756-b827-f44ee6e54def" Path:"" ERRORED: error configuring pod [openshift-multus/network-metrics-daemon-p76jz] networking: Multus: [openshift-multus/network-metrics-daemon-p76jz/4256d841-23cb-4756-b827-f44ee6e54def]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: SetNetworkStatus: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/pods/network-metrics-daemon-p76jz?timeout=1m0s": context deadline exceeded Mar 19 09:23:00.887756 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.887756 master-0 kubenswrapper[7518]: > Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: E0319 09:23:00.887776 7518 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-metrics-daemon-p76jz_openshift-multus_4256d841-23cb-4756-b827-f44ee6e54def_0(af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7): error adding pod 
openshift-multus_network-metrics-daemon-p76jz to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7" Netns:"/var/run/netns/c0771a82-ab35-44c9-a61c-59fe770379db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-multus;K8S_POD_NAME=network-metrics-daemon-p76jz;K8S_POD_INFRA_CONTAINER_ID=af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7;K8S_POD_UID=4256d841-23cb-4756-b827-f44ee6e54def" Path:"" ERRORED: error configuring pod [openshift-multus/network-metrics-daemon-p76jz] networking: Multus: [openshift-multus/network-metrics-daemon-p76jz/4256d841-23cb-4756-b827-f44ee6e54def]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: SetNetworkStatus: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/pods/network-metrics-daemon-p76jz?timeout=1m0s": context deadline exceeded Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: > pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: E0319 09:23:00.887801 7518 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to 
create pod network sandbox k8s_network-metrics-daemon-p76jz_openshift-multus_4256d841-23cb-4756-b827-f44ee6e54def_0(af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7): error adding pod openshift-multus_network-metrics-daemon-p76jz to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7" Netns:"/var/run/netns/c0771a82-ab35-44c9-a61c-59fe770379db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-multus;K8S_POD_NAME=network-metrics-daemon-p76jz;K8S_POD_INFRA_CONTAINER_ID=af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7;K8S_POD_UID=4256d841-23cb-4756-b827-f44ee6e54def" Path:"" ERRORED: error configuring pod [openshift-multus/network-metrics-daemon-p76jz] networking: Multus: [openshift-multus/network-metrics-daemon-p76jz/4256d841-23cb-4756-b827-f44ee6e54def]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: SetNetworkStatus: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/pods/network-metrics-daemon-p76jz?timeout=1m0s": context deadline exceeded Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.888107 master-0 kubenswrapper[7518]: > pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:23:00.888107 master-0 
kubenswrapper[7518]: E0319 09:23:00.887882 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"network-metrics-daemon-p76jz_openshift-multus(4256d841-23cb-4756-b827-f44ee6e54def)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"network-metrics-daemon-p76jz_openshift-multus(4256d841-23cb-4756-b827-f44ee6e54def)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-metrics-daemon-p76jz_openshift-multus_4256d841-23cb-4756-b827-f44ee6e54def_0(af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7): error adding pod openshift-multus_network-metrics-daemon-p76jz to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7\\\" Netns:\\\"/var/run/netns/c0771a82-ab35-44c9-a61c-59fe770379db\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-multus;K8S_POD_NAME=network-metrics-daemon-p76jz;K8S_POD_INFRA_CONTAINER_ID=af3a06bf93cd551ce9021bf4a28c6006f61769f4b8b90e164386ba252949d5a7;K8S_POD_UID=4256d841-23cb-4756-b827-f44ee6e54def\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-multus/network-metrics-daemon-p76jz] networking: Multus: [openshift-multus/network-metrics-daemon-p76jz/4256d841-23cb-4756-b827-f44ee6e54def]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: SetNetworkStatus: failed to update the pod network-metrics-daemon-p76jz in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/pods/network-metrics-daemon-p76jz?timeout=1m0s\\\": context deadline exceeded\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-multus/network-metrics-daemon-p76jz" podUID="4256d841-23cb-4756-b827-f44ee6e54def" Mar 19 09:23:00.940972 master-0 kubenswrapper[7518]: I0319 09:23:00.940100 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/0.log" Mar 19 09:23:00.940972 master-0 kubenswrapper[7518]: I0319 09:23:00.940160 7518 generic.go:334] "Generic (PLEG): container finished" podID="6a8e2194-aba6-4929-a29c-47c63c8ff799" containerID="d43b2cecb46ee4d7282d2377662b9eb7bab83399567784e4db2c8496f2616648" exitCode=1 Mar 19 09:23:00.965430 master-0 kubenswrapper[7518]: E0319 09:23:00.965347 7518 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 19 09:23:00.965430 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-monitoring-operator-58845fbb57-z2869_openshift-monitoring_7ad3ef11-90df-40b1-acbf-ed9b0c708ddb_0(60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef): error adding pod openshift-monitoring_cluster-monitoring-operator-58845fbb57-z2869 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef" Netns:"/var/run/netns/ad8edbac-3572-460e-902e-aab48e82c6d8" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=cluster-monitoring-operator-58845fbb57-z2869;K8S_POD_INFRA_CONTAINER_ID=60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef;K8S_POD_UID=7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" Path:"" ERRORED: error configuring pod [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869] networking: Multus: [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/cluster-monitoring-operator-58845fbb57-z2869?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.965430 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.965430 master-0 kubenswrapper[7518]: > Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: E0319 09:23:00.965452 7518 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-monitoring-operator-58845fbb57-z2869_openshift-monitoring_7ad3ef11-90df-40b1-acbf-ed9b0c708ddb_0(60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef): error adding pod 
openshift-monitoring_cluster-monitoring-operator-58845fbb57-z2869 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef" Netns:"/var/run/netns/ad8edbac-3572-460e-902e-aab48e82c6d8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=cluster-monitoring-operator-58845fbb57-z2869;K8S_POD_INFRA_CONTAINER_ID=60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef;K8S_POD_UID=7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" Path:"" ERRORED: error configuring pod [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869] networking: Multus: [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/cluster-monitoring-operator-58845fbb57-z2869?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: > pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:23:00.965755 master-0 
kubenswrapper[7518]: E0319 09:23:00.965496 7518 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-monitoring-operator-58845fbb57-z2869_openshift-monitoring_7ad3ef11-90df-40b1-acbf-ed9b0c708ddb_0(60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef): error adding pod openshift-monitoring_cluster-monitoring-operator-58845fbb57-z2869 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef" Netns:"/var/run/netns/ad8edbac-3572-460e-902e-aab48e82c6d8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=cluster-monitoring-operator-58845fbb57-z2869;K8S_POD_INFRA_CONTAINER_ID=60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef;K8S_POD_UID=7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" Path:"" ERRORED: error configuring pod [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869] networking: Multus: [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/cluster-monitoring-operator-58845fbb57-z2869?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: > pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:23:00.965755 master-0 kubenswrapper[7518]: E0319 09:23:00.965604 7518 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cluster-monitoring-operator-58845fbb57-z2869_openshift-monitoring(7ad3ef11-90df-40b1-acbf-ed9b0c708ddb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cluster-monitoring-operator-58845fbb57-z2869_openshift-monitoring(7ad3ef11-90df-40b1-acbf-ed9b0c708ddb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-monitoring-operator-58845fbb57-z2869_openshift-monitoring_7ad3ef11-90df-40b1-acbf-ed9b0c708ddb_0(60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef): error adding pod openshift-monitoring_cluster-monitoring-operator-58845fbb57-z2869 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef\\\" Netns:\\\"/var/run/netns/ad8edbac-3572-460e-902e-aab48e82c6d8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=cluster-monitoring-operator-58845fbb57-z2869;K8S_POD_INFRA_CONTAINER_ID=60d35db2459cccd128085c6639fe78473206722d81851530fcbceb862473a1ef;K8S_POD_UID=7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869] networking: Multus: [openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-monitoring-operator-58845fbb57-z2869 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/cluster-monitoring-operator-58845fbb57-z2869?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" podUID="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb"
Mar 19 09:23:02.119248 master-0 kubenswrapper[7518]: I0319 09:23:02.119165 7518 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-ct498 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" start-of-body=
Mar 19 09:23:02.119248 master-0 kubenswrapper[7518]: I0319 09:23:02.119246 7518 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" podUID="9663cc40-a69d-42ba-890e-071cb85062f5" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused"
Mar 19 09:23:04.709061 master-0 kubenswrapper[7518]: E0319 09:23:04.705949 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms"
Mar 19 09:23:05.512980 master-0 kubenswrapper[7518]: E0319 09:23:05.512889 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:23:07.975622 master-0 kubenswrapper[7518]: I0319 09:23:07.975510 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-96qpx_86c4b0e4-3481-465d-b00f-022d2c58c183/openshift-apiserver-operator/1.log"
Mar 19 09:23:07.976332 master-0 kubenswrapper[7518]: I0319 09:23:07.976237 7518 generic.go:334] "Generic (PLEG): container finished" podID="86c4b0e4-3481-465d-b00f-022d2c58c183" containerID="d8a756b9b58a3ce072eadde280ccd4f57de1077de86a738e2697b1425743281c" exitCode=255
Mar 19 09:23:07.979052 master-0 kubenswrapper[7518]: I0319 09:23:07.979005 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw_a75049de-dcf1-4102-b339-f45d5015adea/kube-storage-version-migrator-operator/1.log"
Mar 19 09:23:07.979588 master-0 kubenswrapper[7518]: I0319 09:23:07.979547 7518 generic.go:334] "Generic (PLEG): container finished" podID="a75049de-dcf1-4102-b339-f45d5015adea" containerID="42b9a79d42542a10355bd1a462df5ffb67f1a10eae7fe6919eb834123087d197" exitCode=255
Mar 19 09:23:14.342679 master-0 kubenswrapper[7518]: E0319 09:23:14.342593 7518 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: E0319 09:23:14.342776 7518 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s"
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: I0319 09:23:14.342807 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" event={"ID":"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6","Type":"ContainerDied","Data":"be05318150c766720e5d230c0bf2401720113751ff91aa74d2d72ed4d56c5f47"}
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: I0319 09:23:14.342856 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9"}
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: I0319 09:23:14.342872 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" event={"ID":"5b36f3b2-caf9-40ad-a3a1-e83796142f54","Type":"ContainerDied","Data":"a9e3c64428edfb89f548d2d0f11b93a4546a142c8d9ea26eed5c6670f21e1d16"}
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: I0319 09:23:14.342885 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerDied","Data":"5f66b7b4498be8ffcef1be07d5415ae49ca99cf0c15b74518d97c2537613d5cc"}
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: I0319 09:23:14.343340 7518 scope.go:117] "RemoveContainer" containerID="be05318150c766720e5d230c0bf2401720113751ff91aa74d2d72ed4d56c5f47"
Mar 19 09:23:14.343713 master-0 kubenswrapper[7518]: I0319 09:23:14.343440 7518 scope.go:117] "RemoveContainer" containerID="a9e3c64428edfb89f548d2d0f11b93a4546a142c8d9ea26eed5c6670f21e1d16"
Mar 19 09:23:14.350245 master-0 kubenswrapper[7518]: I0319 09:23:14.350185 7518 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 09:23:15.507735 master-0 kubenswrapper[7518]: E0319 09:23:15.507667 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s"
Mar 19 09:23:15.513302 master-0 kubenswrapper[7518]: E0319 09:23:15.513238 7518 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:23:18.144332 master-0 kubenswrapper[7518]: E0319 09:23:18.144204 7518 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-apiserver-operator-d65958b8-96qpx.189e33a75a1aab89 openshift-apiserver-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-apiserver-operator,Name:openshift-apiserver-operator-d65958b8-96qpx,UID:86c4b0e4-3481-465d-b00f-022d2c58c183,APIVersion:v1,ResourceVersion:3688,FieldPath:spec.containers{openshift-apiserver-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:21:37.094806409 +0000 UTC m=+114.977389678,LastTimestamp:2026-03-19 09:21:37.094806409 +0000 UTC m=+114.977389678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:23:20.039193 master-0 kubenswrapper[7518]: I0319 09:23:20.039064 7518 generic.go:334] "Generic (PLEG): container finished" podID="33e92e5d-61ea-45b2-b357-ebffdaebf4af" containerID="e567b2a6970dbbdd6d360830a8ee46fec46945b28639df21bdc4828de4e3065b" exitCode=0
Mar 19 09:23:26.925790 master-0 kubenswrapper[7518]: E0319 09:23:26.925725 7518 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="12.583s"
Mar 19 09:23:26.925790 master-0 kubenswrapper[7518]: I0319 09:23:26.925783 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:26.932444 master-0 kubenswrapper[7518]: I0319 09:23:26.932392 7518 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 09:23:26.938657 master-0 kubenswrapper[7518]: I0319 09:23:26.938495 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 19 09:23:26.938657 master-0 kubenswrapper[7518]: I0319 09:23:26.938583 7518 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="897e74eb-da26-4e83-bdbc-fda81487ddd0"
Mar 19 09:23:26.938657 master-0 kubenswrapper[7518]: I0319 09:23:26.938599 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" event={"ID":"9663cc40-a69d-42ba-890e-071cb85062f5","Type":"ContainerDied","Data":"cdf18d2610050197f807cf4a5fc0308ba6a5aa77b434d76558194e6bb3ba81d0"}
Mar 19 09:23:26.938801 master-0 kubenswrapper[7518]: I0319 09:23:26.938704 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:26.938801 master-0 kubenswrapper[7518]: I0319 09:23:26.938717 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566"}
Mar 19 09:23:26.938860 master-0 kubenswrapper[7518]: I0319 09:23:26.938810 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:26.938860 master-0 kubenswrapper[7518]: I0319 09:23:26.938825 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2"}
Mar 19 09:23:26.938860 master-0 kubenswrapper[7518]: I0319 09:23:26.938834 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af"}
Mar 19 09:23:26.938860 master-0 kubenswrapper[7518]: I0319 09:23:26.938846 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6"}
Mar 19 09:23:26.938860 master-0 kubenswrapper[7518]: I0319 09:23:26.938857 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938865 7518 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="897e74eb-da26-4e83-bdbc-fda81487ddd0"
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938876 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938886 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938902 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp" event={"ID":"a823c8bc-09ef-46a9-a1f3-155a34b89788","Type":"ContainerStarted","Data":"da58fc5924acbda44564bff047cf3106f1fef61de454ef8b44f4d9f01b7d029e"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938913 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" event={"ID":"f216606b-43d0-43d0-a3e3-a3ee2952e7b8","Type":"ContainerStarted","Data":"58eb1efe846e4939933c14b213b2ce11fa5479af85991845d88a34c673fb10f1"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938924 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerStarted","Data":"93d35b0e89d31207bfcd7222380f4dd439dd98f18c6725c9186b6ad660b61c77"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938934 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"8c6bf6e4dc06dc33ce2a60a0abd7d0a106b6973ee1336f65f910e0cb73c9c346"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938944 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"0df23b55-3dea-4f5e-9d53-5c7755ea4e48","Type":"ContainerDied","Data":"0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938955 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194"
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938964 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"2de53594-9dcc-4318-806a-64f39ef76b3b","Type":"ContainerDied","Data":"846174bbc21aaf0dbb6863b67ef55a4060d549089aa7226a91ee6bec43a301c1"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938973 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846174bbc21aaf0dbb6863b67ef55a4060d549089aa7226a91ee6bec43a301c1"
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938982 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"dc248e59-1519-4ac3-9005-2239214a8d62","Type":"ContainerDied","Data":"2b23049d85d383fc87e2217ac4c88730e6accf178c37b42720c1211cad94765e"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.938996 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerDied","Data":"d43b2cecb46ee4d7282d2377662b9eb7bab83399567784e4db2c8496f2616648"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.939007 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" event={"ID":"86c4b0e4-3481-465d-b00f-022d2c58c183","Type":"ContainerDied","Data":"d8a756b9b58a3ce072eadde280ccd4f57de1077de86a738e2697b1425743281c"}
Mar 19 09:23:26.939002 master-0 kubenswrapper[7518]: I0319 09:23:26.939021 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" event={"ID":"a75049de-dcf1-4102-b339-f45d5015adea","Type":"ContainerDied","Data":"42b9a79d42542a10355bd1a462df5ffb67f1a10eae7fe6919eb834123087d197"}
Mar 19 09:23:26.939445 master-0 kubenswrapper[7518]: I0319 09:23:26.939035 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" event={"ID":"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6","Type":"ContainerStarted","Data":"a506b56a8952771b88a951331f12d1bcc072b7ddce47f17b649a93de89ccfb50"}
Mar 19 09:23:26.939445 master-0 kubenswrapper[7518]: I0319 09:23:26.939047 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" event={"ID":"5b36f3b2-caf9-40ad-a3a1-e83796142f54","Type":"ContainerStarted","Data":"8e697a4bef8c9bdae36a875dbe66d5437457b424b0656b719bb2d3fc551a7b7e"}
Mar 19 09:23:26.939445 master-0 kubenswrapper[7518]: I0319 09:23:26.939059 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" event={"ID":"33e92e5d-61ea-45b2-b357-ebffdaebf4af","Type":"ContainerDied","Data":"e567b2a6970dbbdd6d360830a8ee46fec46945b28639df21bdc4828de4e3065b"}
Mar 19 09:23:26.939620 master-0 kubenswrapper[7518]: I0319 09:23:26.939513 7518 scope.go:117] "RemoveContainer" containerID="e567b2a6970dbbdd6d360830a8ee46fec46945b28639df21bdc4828de4e3065b"
Mar 19 09:23:26.940001 master-0 kubenswrapper[7518]: I0319 09:23:26.939971 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:23:26.940607 master-0 kubenswrapper[7518]: I0319 09:23:26.940268 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:23:26.940607 master-0 kubenswrapper[7518]: I0319 09:23:26.940354 7518 scope.go:117] "RemoveContainer" containerID="cdf18d2610050197f807cf4a5fc0308ba6a5aa77b434d76558194e6bb3ba81d0"
Mar 19 09:23:26.940607 master-0 kubenswrapper[7518]: I0319 09:23:26.940585 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:23:26.940773 master-0 kubenswrapper[7518]: I0319 09:23:26.940731 7518 scope.go:117] "RemoveContainer" containerID="f432083e0bbefbf0b796c955a8b8a3248de20b6a5a5f87ee1ff2f03234e367ae"
Mar 19 09:23:26.940981 master-0 kubenswrapper[7518]: I0319 09:23:26.940937 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:23:26.941039 master-0 kubenswrapper[7518]: I0319 09:23:26.941000 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:23:26.941080 master-0 kubenswrapper[7518]: I0319 09:23:26.941049 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:23:26.941080 master-0 kubenswrapper[7518]: I0319 09:23:26.941074 7518 scope.go:117] "RemoveContainer" containerID="d8a756b9b58a3ce072eadde280ccd4f57de1077de86a738e2697b1425743281c"
Mar 19 09:23:26.942081 master-0 kubenswrapper[7518]: I0319 09:23:26.941930 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 09:23:26.942678 master-0 kubenswrapper[7518]: I0319 09:23:26.942635 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:23:26.945427 master-0 kubenswrapper[7518]: I0319 09:23:26.945411 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:23:26.946343 master-0 kubenswrapper[7518]: I0319 09:23:26.946317 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:23:26.947128 master-0 kubenswrapper[7518]: I0319 09:23:26.947113 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:23:26.947771 master-0 kubenswrapper[7518]: I0319 09:23:26.947758 7518 scope.go:117] "RemoveContainer" containerID="d43b2cecb46ee4d7282d2377662b9eb7bab83399567784e4db2c8496f2616648"
Mar 19 09:23:26.948329 master-0 kubenswrapper[7518]: I0319 09:23:26.948317 7518 scope.go:117] "RemoveContainer" containerID="42b9a79d42542a10355bd1a462df5ffb67f1a10eae7fe6919eb834123087d197"
Mar 19 09:23:26.951734 master-0 kubenswrapper[7518]: I0319 09:23:26.951696 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 09:23:27.068050 master-0 kubenswrapper[7518]: I0319 09:23:27.067792 7518 scope.go:117] "RemoveContainer" containerID="f771ab2ec3cdd043d42f5957ed84808b36b0f576aa969f9e8666ac7eb9b0b134"
Mar 19 09:23:27.226510 master-0 kubenswrapper[7518]: I0319 09:23:27.225998 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" podStartSLOduration=111.634427864 podStartE2EDuration="1m54.225977113s" podCreationTimestamp="2026-03-19 09:21:33 +0000 UTC" firstStartedPulling="2026-03-19 09:21:34.652639698 +0000 UTC m=+112.535222957" lastFinishedPulling="2026-03-19 09:21:37.244188957 +0000 UTC m=+115.126772206" observedRunningTime="2026-03-19 09:23:27.225291325 +0000 UTC m=+225.107874584" watchObservedRunningTime="2026-03-19 09:23:27.225977113 +0000 UTC m=+225.108560372"
Mar 19 09:23:27.323219 master-0 kubenswrapper[7518]: I0319 09:23:27.322291 7518 scope.go:117] "RemoveContainer" containerID="239a4aff890f70e77543607e882c4861b3b7d9ef6cf1f395add14a0ad7fc62e0"
Mar 19 09:23:27.352674 master-0 kubenswrapper[7518]: E0319 09:23:27.351968 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:27.439056 master-0 kubenswrapper[7518]: I0319 09:23:27.438914 7518 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:23:27.439210 master-0 kubenswrapper[7518]: I0319 09:23:27.439185 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:23:27.629603 master-0 kubenswrapper[7518]: I0319 09:23:27.629564 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_dc248e59-1519-4ac3-9005-2239214a8d62/installer/0.log"
Mar 19 09:23:27.629894 master-0 kubenswrapper[7518]: I0319 09:23:27.629648 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:23:27.717354 master-0 kubenswrapper[7518]: I0319 09:23:27.717237 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p76jz"]
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734235 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc248e59-1519-4ac3-9005-2239214a8d62-kube-api-access\") pod \"dc248e59-1519-4ac3-9005-2239214a8d62\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") "
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734304 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-kubelet-dir\") pod \"dc248e59-1519-4ac3-9005-2239214a8d62\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") "
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734409 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-var-lock\") pod \"dc248e59-1519-4ac3-9005-2239214a8d62\" (UID: \"dc248e59-1519-4ac3-9005-2239214a8d62\") "
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734817 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dc248e59-1519-4ac3-9005-2239214a8d62" (UID: "dc248e59-1519-4ac3-9005-2239214a8d62"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734843 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-var-lock" (OuterVolumeSpecName: "var-lock") pod "dc248e59-1519-4ac3-9005-2239214a8d62" (UID: "dc248e59-1519-4ac3-9005-2239214a8d62"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734899 7518 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:23:27.735307 master-0 kubenswrapper[7518]: I0319 09:23:27.734916 7518 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc248e59-1519-4ac3-9005-2239214a8d62-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:23:27.740705 master-0 kubenswrapper[7518]: I0319 09:23:27.739082 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"]
Mar 19 09:23:27.740705 master-0 kubenswrapper[7518]: I0319 09:23:27.740650 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc248e59-1519-4ac3-9005-2239214a8d62-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc248e59-1519-4ac3-9005-2239214a8d62" (UID: "dc248e59-1519-4ac3-9005-2239214a8d62"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:23:27.742783 master-0 kubenswrapper[7518]: I0319 09:23:27.742694 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"]
Mar 19 09:23:27.757678 master-0 kubenswrapper[7518]: W0319 09:23:27.755974 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2148fe_f9f6_47da_894c_b88dae360ebe.slice/crio-e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330 WatchSource:0}: Error finding container e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330: Status 404 returned error can't find the container with id e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330
Mar 19 09:23:27.836076 master-0 kubenswrapper[7518]: I0319 09:23:27.836030 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc248e59-1519-4ac3-9005-2239214a8d62-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:23:28.185603 master-0 kubenswrapper[7518]: I0319 09:23:28.185553 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"]
Mar 19 09:23:28.187411 master-0 kubenswrapper[7518]: I0319 09:23:28.187350 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"]
Mar 19 09:23:28.202692 master-0 kubenswrapper[7518]: W0319 09:23:28.202620 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208939f5_8fca_4fd5_b0c6_43484b7d1e30.slice/crio-dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a WatchSource:0}: Error finding container dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a: Status 404 returned error can't find the container with id dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a
Mar 19 09:23:28.219333 master-0 kubenswrapper[7518]: I0319 09:23:28.219199 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" event={"ID":"33e92e5d-61ea-45b2-b357-ebffdaebf4af","Type":"ContainerStarted","Data":"bcdb0cf22b96fe48eebdc24abb2d5b2914b32473e5a78be9ded7d96d4faa029e"}
Mar 19 09:23:28.220833 master-0 kubenswrapper[7518]: I0319 09:23:28.220807 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:23:28.223052 master-0 kubenswrapper[7518]: I0319 09:23:28.223030 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:23:28.223389 master-0 kubenswrapper[7518]: I0319 09:23:28.223358 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/0.log"
Mar 19 09:23:28.223442 master-0 kubenswrapper[7518]: I0319 09:23:28.223420 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerStarted","Data":"46871c9c3ca81cca6462ffff9ccbad93a04486b47c22835c7cced6225bc557cc"}
Mar 19 09:23:28.229952 master-0 kubenswrapper[7518]: I0319 09:23:28.229163 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw_a75049de-dcf1-4102-b339-f45d5015adea/kube-storage-version-migrator-operator/1.log"
Mar 19 09:23:28.229952 master-0 kubenswrapper[7518]: I0319 09:23:28.229308 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" event={"ID":"a75049de-dcf1-4102-b339-f45d5015adea","Type":"ContainerStarted","Data":"3e4a5f91bbe889960183d49b6c04f975e58d2aaf97e625bbc767490692536ac3"}
Mar 19 09:23:28.231752 master-0 kubenswrapper[7518]: I0319 09:23:28.231712 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" event={"ID":"1f2148fe-f9f6-47da-894c-b88dae360ebe","Type":"ContainerStarted","Data":"f055457a8bf487e9762108005fd44c5b439c9059099acab0cc08d9703d2b313f"}
Mar 19 09:23:28.231752 master-0 kubenswrapper[7518]: I0319 09:23:28.231747 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" event={"ID":"1f2148fe-f9f6-47da-894c-b88dae360ebe","Type":"ContainerStarted","Data":"e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330"}
Mar 19 09:23:28.233530 master-0 kubenswrapper[7518]: I0319 09:23:28.233470 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_dc248e59-1519-4ac3-9005-2239214a8d62/installer/0.log"
Mar 19 09:23:28.233623 master-0 kubenswrapper[7518]: I0319 09:23:28.233593 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"dc248e59-1519-4ac3-9005-2239214a8d62","Type":"ContainerDied","Data":"430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82"}
Mar 19 09:23:28.233669 master-0 kubenswrapper[7518]: I0319 09:23:28.233630 7518 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82"
Mar 19 09:23:28.233669 master-0 kubenswrapper[7518]: I0319 09:23:28.233652 7518 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:23:28.234824 master-0 kubenswrapper[7518]: I0319 09:23:28.234446 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" event={"ID":"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb","Type":"ContainerStarted","Data":"62125637029a850812cbf1a1551ac9bf8a2431cbf9d2111e28185931308bf215"}
Mar 19 09:23:28.235256 master-0 kubenswrapper[7518]: I0319 09:23:28.235234 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p76jz" event={"ID":"4256d841-23cb-4756-b827-f44ee6e54def","Type":"ContainerStarted","Data":"3b84ff6dcb01c2864416447de1ea9c58a9ceb02e0ee8e948fe0ed652019990a3"}
Mar 19 09:23:28.236814 master-0 kubenswrapper[7518]: I0319 09:23:28.236792 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-96qpx_86c4b0e4-3481-465d-b00f-022d2c58c183/openshift-apiserver-operator/1.log"
Mar 19 09:23:28.236877 master-0 kubenswrapper[7518]: I0319 09:23:28.236855 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" event={"ID":"86c4b0e4-3481-465d-b00f-022d2c58c183","Type":"ContainerStarted","Data":"50a5b3fc7bc5457dee2ead0fa55b918d53b9dacde8becaf65f9fc311fe592375"}
Mar 19 09:23:28.239641 master-0 kubenswrapper[7518]: I0319 09:23:28.239556 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" event={"ID":"9663cc40-a69d-42ba-890e-071cb85062f5","Type":"ContainerStarted","Data":"ebc9076ad4eac45628e40a9c341c05a16e33d15fb8f08cc5572fe3b72f414123"}
Mar 19 09:23:28.243717 master-0 kubenswrapper[7518]: I0319 09:23:28.240719 7518 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-pn5gg container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.40:8081/readyz\": dial tcp 10.128.0.40:8081: connect: connection refused" start-of-body=
Mar 19 09:23:28.243717 master-0 kubenswrapper[7518]: I0319 09:23:28.240810 7518 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" podUID="db42b38e-294e-4016-8ac1-54126ac60de8" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.40:8081/readyz\": dial tcp 10.128.0.40:8081: connect: connection refused"
Mar 19 09:23:28.243717 master-0 kubenswrapper[7518]: I0319 09:23:28.240873 7518 patch_prober.go:28] interesting pod/operator-controller-controller-manager-57777556ff-pn5gg container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.40:8081/healthz\": dial tcp 10.128.0.40:8081: connect: connection refused" start-of-body=
Mar 19 09:23:28.243717 master-0 kubenswrapper[7518]: I0319 09:23:28.240897 7518 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" podUID="db42b38e-294e-4016-8ac1-54126ac60de8" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.40:8081/healthz\": dial tcp 10.128.0.40:8081: connect: connection refused"
Mar 19 09:23:28.252581 master-0 kubenswrapper[7518]: I0319 09:23:28.252550 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 19 09:23:28.322228 master-0 kubenswrapper[7518]: I0319
09:23:28.322155 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a807d4-04b4-40ec-b855-5aea08b58bcd" path="/var/lib/kubelet/pods/c7a807d4-04b4-40ec-b855-5aea08b58bcd/volumes" Mar 19 09:23:28.324502 master-0 kubenswrapper[7518]: I0319 09:23:28.324423 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:23:29.018859 master-0 kubenswrapper[7518]: I0319 09:23:29.018795 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:23:29.023442 master-0 kubenswrapper[7518]: I0319 09:23:29.023407 7518 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:23:29.257539 master-0 kubenswrapper[7518]: I0319 09:23:29.255984 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" event={"ID":"208939f5-8fca-4fd5-b0c6-43484b7d1e30","Type":"ContainerStarted","Data":"dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a"} Mar 19 09:23:29.265740 master-0 kubenswrapper[7518]: I0319 09:23:29.265668 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" event={"ID":"8aa0f17a-287e-4a19-9a59-4913e7707071","Type":"ContainerStarted","Data":"4d0ada47f0cb160d98966d63d8a86c801dfaedca21b5932c03647c7678f530ef"} Mar 19 09:23:29.268489 master-0 kubenswrapper[7518]: I0319 09:23:29.268336 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-pn5gg_db42b38e-294e-4016-8ac1-54126ac60de8/manager/0.log" Mar 19 09:23:29.268489 master-0 kubenswrapper[7518]: I0319 09:23:29.268383 7518 generic.go:334] "Generic (PLEG): container finished" podID="db42b38e-294e-4016-8ac1-54126ac60de8" 
containerID="35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50" exitCode=1 Mar 19 09:23:29.268602 master-0 kubenswrapper[7518]: I0319 09:23:29.268534 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerDied","Data":"35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50"} Mar 19 09:23:29.269132 master-0 kubenswrapper[7518]: I0319 09:23:29.269064 7518 scope.go:117] "RemoveContainer" containerID="35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50" Mar 19 09:23:29.273175 master-0 kubenswrapper[7518]: I0319 09:23:29.273139 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:23:30.276100 master-0 kubenswrapper[7518]: I0319 09:23:30.276052 7518 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-pn5gg_db42b38e-294e-4016-8ac1-54126ac60de8/manager/0.log" Mar 19 09:23:30.276762 master-0 kubenswrapper[7518]: I0319 09:23:30.276132 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerStarted","Data":"8dc1d90f2bef6de1fc7cdb208514e8d0665da95d86860b74cd808de5a4cefbd2"} Mar 19 09:23:30.276762 master-0 kubenswrapper[7518]: I0319 09:23:30.276664 7518 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" containerID="cri-o://35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50" Mar 19 09:23:30.276762 master-0 kubenswrapper[7518]: I0319 09:23:30.276689 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:23:31.281789 master-0 kubenswrapper[7518]: I0319 09:23:31.281735 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:23:32.290709 master-0 kubenswrapper[7518]: I0319 09:23:32.290649 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" event={"ID":"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb","Type":"ContainerStarted","Data":"f12c19567e74d450d6c71f551a0b5df7cd464f82215745c4fd659e90af20475d"} Mar 19 09:23:32.293699 master-0 kubenswrapper[7518]: I0319 09:23:32.293655 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p76jz" event={"ID":"4256d841-23cb-4756-b827-f44ee6e54def","Type":"ContainerStarted","Data":"de66785178983305f851c49aa6db65f412346fb95151e1122b539d195fc71316"} Mar 19 09:23:32.892100 master-0 kubenswrapper[7518]: I0319 09:23:32.892051 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:23:33.302361 master-0 kubenswrapper[7518]: I0319 09:23:33.302291 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p76jz" event={"ID":"4256d841-23cb-4756-b827-f44ee6e54def","Type":"ContainerStarted","Data":"821f1439eb0ef90d3cbd80934d1da6abd10b3ca35d5a742026abdf26202f1351"} Mar 19 09:23:33.621661 master-0 kubenswrapper[7518]: E0319 09:23:33.621506 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 19 09:23:33.655749 master-0 kubenswrapper[7518]: I0319 09:23:33.655497 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.655455174 podStartE2EDuration="1.655455174s" 
podCreationTimestamp="2026-03-19 09:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:33.640069727 +0000 UTC m=+231.522653006" watchObservedRunningTime="2026-03-19 09:23:33.655455174 +0000 UTC m=+231.538038443" Mar 19 09:23:38.243549 master-0 kubenswrapper[7518]: I0319 09:23:38.243486 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:23:44.357128 master-0 kubenswrapper[7518]: I0319 09:23:44.356976 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" event={"ID":"8aa0f17a-287e-4a19-9a59-4913e7707071","Type":"ContainerStarted","Data":"98ede748fb391e61a9fa27864b391e45ecd330b8f317d785a08e0eb7199be853"} Mar 19 09:23:44.357785 master-0 kubenswrapper[7518]: I0319 09:23:44.357669 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:23:44.362024 master-0 kubenswrapper[7518]: I0319 09:23:44.361978 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:23:45.362987 master-0 kubenswrapper[7518]: I0319 09:23:45.362868 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" event={"ID":"208939f5-8fca-4fd5-b0c6-43484b7d1e30","Type":"ContainerStarted","Data":"fcc4b9662276adf129c1f7ca4f603580c8924fcf6b57ddaef44239b5b092b615"} Mar 19 09:23:45.363453 master-0 kubenswrapper[7518]: I0319 09:23:45.363108 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:23:45.368106 master-0 kubenswrapper[7518]: I0319 
09:23:45.368046 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:23:46.370805 master-0 kubenswrapper[7518]: I0319 09:23:46.370743 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" event={"ID":"1f2148fe-f9f6-47da-894c-b88dae360ebe","Type":"ContainerStarted","Data":"9be823979f8f1a6a0c0a163901da50208c4e8126e451a57fb8faa45e85570563"} Mar 19 09:23:46.393726 master-0 kubenswrapper[7518]: I0319 09:23:46.393662 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:23:46.393912 master-0 kubenswrapper[7518]: E0319 09:23:46.393903 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc248e59-1519-4ac3-9005-2239214a8d62" containerName="installer" Mar 19 09:23:46.393951 master-0 kubenswrapper[7518]: I0319 09:23:46.393919 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc248e59-1519-4ac3-9005-2239214a8d62" containerName="installer" Mar 19 09:23:46.393951 master-0 kubenswrapper[7518]: E0319 09:23:46.393936 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a807d4-04b4-40ec-b855-5aea08b58bcd" containerName="installer" Mar 19 09:23:46.393951 master-0 kubenswrapper[7518]: I0319 09:23:46.393946 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a807d4-04b4-40ec-b855-5aea08b58bcd" containerName="installer" Mar 19 09:23:46.394031 master-0 kubenswrapper[7518]: E0319 09:23:46.393962 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerName="installer" Mar 19 09:23:46.394031 master-0 kubenswrapper[7518]: I0319 09:23:46.393973 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerName="installer" Mar 19 09:23:46.394031 master-0 kubenswrapper[7518]: E0319 
09:23:46.393989 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerName="installer" Mar 19 09:23:46.394031 master-0 kubenswrapper[7518]: I0319 09:23:46.394000 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerName="installer" Mar 19 09:23:46.394031 master-0 kubenswrapper[7518]: E0319 09:23:46.394014 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerName="installer" Mar 19 09:23:46.394031 master-0 kubenswrapper[7518]: I0319 09:23:46.394023 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerName="installer" Mar 19 09:23:46.394221 master-0 kubenswrapper[7518]: I0319 09:23:46.394131 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc248e59-1519-4ac3-9005-2239214a8d62" containerName="installer" Mar 19 09:23:46.394221 master-0 kubenswrapper[7518]: I0319 09:23:46.394146 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerName="installer" Mar 19 09:23:46.394221 master-0 kubenswrapper[7518]: I0319 09:23:46.394161 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerName="installer" Mar 19 09:23:46.394221 master-0 kubenswrapper[7518]: I0319 09:23:46.394174 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerName="installer" Mar 19 09:23:46.394221 master-0 kubenswrapper[7518]: I0319 09:23:46.394185 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a807d4-04b4-40ec-b855-5aea08b58bcd" containerName="installer" Mar 19 09:23:46.394657 master-0 kubenswrapper[7518]: I0319 09:23:46.394634 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.396540 master-0 kubenswrapper[7518]: I0319 09:23:46.396515 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-w5d24" Mar 19 09:23:46.396901 master-0 kubenswrapper[7518]: I0319 09:23:46.396859 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:23:46.450181 master-0 kubenswrapper[7518]: I0319 09:23:46.450125 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.450401 master-0 kubenswrapper[7518]: I0319 09:23:46.450372 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.450688 master-0 kubenswrapper[7518]: I0319 09:23:46.450619 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.551359 master-0 kubenswrapper[7518]: I0319 09:23:46.551294 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod 
\"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.551359 master-0 kubenswrapper[7518]: I0319 09:23:46.551358 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.551623 master-0 kubenswrapper[7518]: I0319 09:23:46.551433 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.551623 master-0 kubenswrapper[7518]: I0319 09:23:46.551518 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.551732 master-0 kubenswrapper[7518]: I0319 09:23:46.551691 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:46.705746 master-0 kubenswrapper[7518]: I0319 09:23:46.705685 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:23:46.730413 master-0 kubenswrapper[7518]: I0319 09:23:46.730380 7518 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:47.023870 master-0 kubenswrapper[7518]: I0319 09:23:47.023769 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:23:47.563396 master-0 kubenswrapper[7518]: I0319 09:23:47.563329 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wqngb"] Mar 19 09:23:47.564649 master-0 kubenswrapper[7518]: I0319 09:23:47.564615 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.565571 master-0 kubenswrapper[7518]: I0319 09:23:47.565502 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tkx45"] Mar 19 09:23:47.566810 master-0 kubenswrapper[7518]: I0319 09:23:47.566731 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.570021 master-0 kubenswrapper[7518]: I0319 09:23:47.569364 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vp2s5" Mar 19 09:23:47.570021 master-0 kubenswrapper[7518]: I0319 09:23:47.569698 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-bvdqs" Mar 19 09:23:47.602491 master-0 kubenswrapper[7518]: I0319 09:23:47.602323 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 09:23:47.610858 master-0 kubenswrapper[7518]: I0319 09:23:47.609943 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqngb"] Mar 19 09:23:47.623444 master-0 kubenswrapper[7518]: I0319 09:23:47.623362 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tkx45"] Mar 19 09:23:47.766495 master-0 kubenswrapper[7518]: I0319 09:23:47.766414 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-catalog-content\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.766495 master-0 kubenswrapper[7518]: I0319 09:23:47.766498 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txp58\" (UniqueName: \"kubernetes.io/projected/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-kube-api-access-txp58\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.766801 master-0 kubenswrapper[7518]: I0319 09:23:47.766532 7518 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-utilities\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.766801 master-0 kubenswrapper[7518]: I0319 09:23:47.766586 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-catalog-content\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.766801 master-0 kubenswrapper[7518]: I0319 09:23:47.766618 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n49x9\" (UniqueName: \"kubernetes.io/projected/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-kube-api-access-n49x9\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.766801 master-0 kubenswrapper[7518]: I0319 09:23:47.766654 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-utilities\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.868288 master-0 kubenswrapper[7518]: I0319 09:23:47.868216 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-utilities\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " 
pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.868288 master-0 kubenswrapper[7518]: I0319 09:23:47.868287 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-catalog-content\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.868517 master-0 kubenswrapper[7518]: I0319 09:23:47.868323 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txp58\" (UniqueName: \"kubernetes.io/projected/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-kube-api-access-txp58\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.868517 master-0 kubenswrapper[7518]: I0319 09:23:47.868352 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-utilities\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.868517 master-0 kubenswrapper[7518]: I0319 09:23:47.868416 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-catalog-content\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.868517 master-0 kubenswrapper[7518]: I0319 09:23:47.868451 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n49x9\" (UniqueName: \"kubernetes.io/projected/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-kube-api-access-n49x9\") pod 
\"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.869261 master-0 kubenswrapper[7518]: I0319 09:23:47.869233 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-utilities\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.869459 master-0 kubenswrapper[7518]: I0319 09:23:47.869434 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-utilities\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.869569 master-0 kubenswrapper[7518]: I0319 09:23:47.869485 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-catalog-content\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:23:47.869793 master-0 kubenswrapper[7518]: I0319 09:23:47.869763 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-catalog-content\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45" Mar 19 09:23:47.920235 master-0 kubenswrapper[7518]: I0319 09:23:47.920191 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n49x9\" (UniqueName: \"kubernetes.io/projected/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-kube-api-access-n49x9\") pod 
\"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:23:47.921438 master-0 kubenswrapper[7518]: I0319 09:23:47.921412 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txp58\" (UniqueName: \"kubernetes.io/projected/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-kube-api-access-txp58\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:23:47.970039 master-0 kubenswrapper[7518]: I0319 09:23:47.969948 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:23:48.199542 master-0 kubenswrapper[7518]: I0319 09:23:48.199489 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:23:48.400541 master-0 kubenswrapper[7518]: I0319 09:23:48.400096 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"89be0036-a2c8-48b4-9eaf-17fab972c4f4","Type":"ContainerStarted","Data":"5d59e82ae91c2ed1c8a992bffe58e7eea15792d208f2b71cb72f5ee7bff4f994"}
Mar 19 09:23:48.400541 master-0 kubenswrapper[7518]: I0319 09:23:48.400158 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"89be0036-a2c8-48b4-9eaf-17fab972c4f4","Type":"ContainerStarted","Data":"58d5b64552b14fa37f1c4ade1890dfcbcf78def52cdf495457e904377a1b0a43"}
Mar 19 09:23:48.482563 master-0 kubenswrapper[7518]: I0319 09:23:48.469816 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tkx45"]
Mar 19 09:23:48.617566 master-0 kubenswrapper[7518]: I0319 09:23:48.605449 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.605425526 podStartE2EDuration="2.605425526s" podCreationTimestamp="2026-03-19 09:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:48.485228982 +0000 UTC m=+246.367812251" watchObservedRunningTime="2026-03-19 09:23:48.605425526 +0000 UTC m=+246.488008795"
Mar 19 09:23:48.617566 master-0 kubenswrapper[7518]: I0319 09:23:48.606869 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"]
Mar 19 09:23:48.626818 master-0 kubenswrapper[7518]: I0319 09:23:48.623924 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:23:48.627858 master-0 kubenswrapper[7518]: I0319 09:23:48.627822 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 09:23:48.628117 master-0 kubenswrapper[7518]: I0319 09:23:48.628086 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xvfqf"
Mar 19 09:23:48.628322 master-0 kubenswrapper[7518]: I0319 09:23:48.628308 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 09:23:48.628500 master-0 kubenswrapper[7518]: I0319 09:23:48.628340 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 09:23:48.645523 master-0 kubenswrapper[7518]: I0319 09:23:48.643798 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"]
Mar 19 09:23:48.651551 master-0 kubenswrapper[7518]: I0319 09:23:48.647529 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.651885 master-0 kubenswrapper[7518]: I0319 09:23:48.651846 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"]
Mar 19 09:23:48.652885 master-0 kubenswrapper[7518]: I0319 09:23:48.652851 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-zsf7l"
Mar 19 09:23:48.652997 master-0 kubenswrapper[7518]: I0319 09:23:48.652961 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 09:23:48.653072 master-0 kubenswrapper[7518]: I0319 09:23:48.653054 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.658027 master-0 kubenswrapper[7518]: I0319 09:23:48.653911 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:23:48.658597 master-0 kubenswrapper[7518]: I0319 09:23:48.658571 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:23:48.658936 master-0 kubenswrapper[7518]: I0319 09:23:48.658922 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 09:23:48.659282 master-0 kubenswrapper[7518]: I0319 09:23:48.659265 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 09:23:48.659536 master-0 kubenswrapper[7518]: I0319 09:23:48.659521 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-mdr74"
Mar 19 09:23:48.667500 master-0 kubenswrapper[7518]: I0319 09:23:48.659761 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 09:23:48.667871 master-0 kubenswrapper[7518]: I0319 09:23:48.665483 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-wshz8"]
Mar 19 09:23:48.668040 master-0 kubenswrapper[7518]: I0319 09:23:48.665077 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 09:23:48.668176 master-0 kubenswrapper[7518]: I0319 09:23:48.665228 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:23:48.669686 master-0 kubenswrapper[7518]: I0319 09:23:48.669665 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"]
Mar 19 09:23:48.680002 master-0 kubenswrapper[7518]: I0319 09:23:48.670394 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.680002 master-0 kubenswrapper[7518]: I0319 09:23:48.670423 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:23:48.682542 master-0 kubenswrapper[7518]: I0319 09:23:48.682478 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"]
Mar 19 09:23:48.686531 master-0 kubenswrapper[7518]: I0319 09:23:48.684097 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"]
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.691409 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.692103 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.692658 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-llwk7"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.695275 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.695621 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.697055 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.697315 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 09:23:48.698838 master-0 kubenswrapper[7518]: I0319 09:23:48.697857 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-r6z7f"
Mar 19 09:23:48.712515 master-0 kubenswrapper[7518]: I0319 09:23:48.706259 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 09:23:48.739184 master-0 kubenswrapper[7518]: I0319 09:23:48.739053 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-wshz8"]
Mar 19 09:23:48.754129 master-0 kubenswrapper[7518]: I0319 09:23:48.754065 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"]
Mar 19 09:23:48.760815 master-0 kubenswrapper[7518]: I0319 09:23:48.758319 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"]
Mar 19 09:23:48.760815 master-0 kubenswrapper[7518]: I0319 09:23:48.760069 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"
Mar 19 09:23:48.763724 master-0 kubenswrapper[7518]: I0319 09:23:48.763282 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"]
Mar 19 09:23:48.765146 master-0 kubenswrapper[7518]: I0319 09:23:48.764417 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 09:23:48.765146 master-0 kubenswrapper[7518]: I0319 09:23:48.764623 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xcbjl"
Mar 19 09:23:48.765146 master-0 kubenswrapper[7518]: I0319 09:23:48.764635 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 09:23:48.765146 master-0 kubenswrapper[7518]: I0319 09:23:48.764739 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 09:23:48.765146 master-0 kubenswrapper[7518]: I0319 09:23:48.764869 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:23:48.765146 master-0 kubenswrapper[7518]: I0319 09:23:48.764937 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:23:48.765442 master-0 kubenswrapper[7518]: I0319 09:23:48.765237 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"]
Mar 19 09:23:48.767251 master-0 kubenswrapper[7518]: I0319 09:23:48.766531 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:23:48.767935 master-0 kubenswrapper[7518]: I0319 09:23:48.767308 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"]
Mar 19 09:23:48.767935 master-0 kubenswrapper[7518]: I0319 09:23:48.767546 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:23:48.772946 master-0 kubenswrapper[7518]: I0319 09:23:48.771247 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"]
Mar 19 09:23:48.772946 master-0 kubenswrapper[7518]: I0319 09:23:48.771411 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:23:48.772946 master-0 kubenswrapper[7518]: I0319 09:23:48.772071 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"]
Mar 19 09:23:48.777922 master-0 kubenswrapper[7518]: I0319 09:23:48.777556 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqngb"]
Mar 19 09:23:48.781460 master-0 kubenswrapper[7518]: I0319 09:23:48.781039 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 09:23:48.781460 master-0 kubenswrapper[7518]: I0319 09:23:48.781385 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-pr68p"
Mar 19 09:23:48.781713 master-0 kubenswrapper[7518]: I0319 09:23:48.781661 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:23:48.781764 master-0 kubenswrapper[7518]: I0319 09:23:48.781715 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 09:23:48.781908 master-0 kubenswrapper[7518]: I0319 09:23:48.781859 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wkbj2"
Mar 19 09:23:48.782160 master-0 kubenswrapper[7518]: I0319 09:23:48.782107 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 09:23:48.782240 master-0 kubenswrapper[7518]: I0319 09:23:48.782206 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.787921 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.787976 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788008 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788039 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788081 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz7d6\" (UniqueName: \"kubernetes.io/projected/31742478-0d83-48cf-b38b-02416d95d4a8-kube-api-access-wz7d6\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788108 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws5kr\" (UniqueName: \"kubernetes.io/projected/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-kube-api-access-ws5kr\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788138 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788166 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788191 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788213 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0cb70a30-a8d1-4037-81e6-eb4f0510a234-snapshots\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788233 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhwd7\" (UniqueName: \"kubernetes.io/projected/141cb120-92da-4d8d-bc29-fc4c433a6336-kube-api-access-fhwd7\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788258 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788283 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb70a30-a8d1-4037-81e6-eb4f0510a234-serving-cert\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788304 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7x89\" (UniqueName: \"kubernetes.io/projected/0cb70a30-a8d1-4037-81e6-eb4f0510a234-kube-api-access-q7x89\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788335 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788364 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788386 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.788707 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.789167 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.789402 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.789586 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zxmm6"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.789718 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 09:23:48.790088 master-0 kubenswrapper[7518]: I0319 09:23:48.790011 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"]
Mar 19 09:23:48.794712 master-0 kubenswrapper[7518]: I0319 09:23:48.792510 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"]
Mar 19 09:23:48.878455 master-0 kubenswrapper[7518]: I0319 09:23:48.869876 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wzz6n"]
Mar 19 09:23:48.878455 master-0 kubenswrapper[7518]: I0319 09:23:48.871258 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:48.878455 master-0 kubenswrapper[7518]: I0319 09:23:48.874108 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6s584"
Mar 19 09:23:48.889886 master-0 kubenswrapper[7518]: I0319 09:23:48.889834 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:23:48.889975 master-0 kubenswrapper[7518]: I0319 09:23:48.889909 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt22g\" (UniqueName: \"kubernetes.io/projected/9ca444a4-4d78-456f-9656-0c28076ce77e-kube-api-access-kt22g\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"
Mar 19 09:23:48.889975 master-0 kubenswrapper[7518]: I0319 09:23:48.889947 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:23:48.890038 master-0 kubenswrapper[7518]: I0319 09:23:48.889987 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"
Mar 19 09:23:48.890073 master-0 kubenswrapper[7518]: I0319 09:23:48.890040 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.890117 master-0 kubenswrapper[7518]: I0319 09:23:48.890068 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:23:48.890148 master-0 kubenswrapper[7518]: I0319 09:23:48.890115 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.890182 master-0 kubenswrapper[7518]: I0319 09:23:48.890147 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.890466 master-0 kubenswrapper[7518]: I0319 09:23:48.890426 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:23:48.890536 master-0 kubenswrapper[7518]: I0319 09:23:48.890491 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdb6\" (UniqueName: \"kubernetes.io/projected/cd42096c-f18d-4bb5-8a51-8761dc1edb73-kube-api-access-dxdb6\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:23:48.890674 master-0 kubenswrapper[7518]: I0319 09:23:48.890642 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz7d6\" (UniqueName: \"kubernetes.io/projected/31742478-0d83-48cf-b38b-02416d95d4a8-kube-api-access-wz7d6\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:23:48.890712 master-0 kubenswrapper[7518]: I0319 09:23:48.890686 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:23:48.890747 master-0 kubenswrapper[7518]: I0319 09:23:48.890720 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:23:48.890846 master-0 kubenswrapper[7518]: I0319 09:23:48.890806 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws5kr\" (UniqueName: \"kubernetes.io/projected/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-kube-api-access-ws5kr\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:48.890898 master-0 kubenswrapper[7518]: I0319 09:23:48.890870 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:23:48.890930 master-0 kubenswrapper[7518]: I0319 09:23:48.890910 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.891041 master-0 kubenswrapper[7518]: I0319 09:23:48.890989 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfw5k\" (UniqueName: \"kubernetes.io/projected/f93b8728-4a33-4ee4-b7c6-cff7d7995953-kube-api-access-kfw5k\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:23:48.891089 master-0 kubenswrapper[7518]: I0319 09:23:48.891032 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.891123 master-0 kubenswrapper[7518]: I0319 09:23:48.891058 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891169 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891226 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891268 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0cb70a30-a8d1-4037-81e6-eb4f0510a234-snapshots\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891294 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhwd7\" (UniqueName: \"kubernetes.io/projected/141cb120-92da-4d8d-bc29-fc4c433a6336-kube-api-access-fhwd7\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891632 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891702 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb70a30-a8d1-4037-81e6-eb4f0510a234-serving-cert\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891735 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7x89\" (UniqueName: \"kubernetes.io/projected/0cb70a30-a8d1-4037-81e6-eb4f0510a234-kube-api-access-q7x89\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891804 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891847 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891886 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891918 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891952 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvp9m\" (UniqueName: \"kubernetes.io/projected/d32541c9-eef6-417c-9f5a-a7392dc70aa0-kube-api-access-fvp9m\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.891982 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID:
\"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.892029 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.892050 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.892115 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0cb70a30-a8d1-4037-81e6-eb4f0510a234-snapshots\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.892982 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" Mar 19 09:23:48.893186 master-0 kubenswrapper[7518]: I0319 09:23:48.892999 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:23:48.894190 master-0 kubenswrapper[7518]: I0319 09:23:48.894159 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:23:48.894275 master-0 kubenswrapper[7518]: I0319 09:23:48.894159 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:23:48.894784 master-0 kubenswrapper[7518]: I0319 09:23:48.894756 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:23:48.894877 master-0 kubenswrapper[7518]: I0319 09:23:48.894846 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" Mar 19 09:23:48.895274 master-0 kubenswrapper[7518]: I0319 09:23:48.895246 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:23:48.896245 master-0 kubenswrapper[7518]: I0319 09:23:48.896210 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv" Mar 19 09:23:48.896328 master-0 kubenswrapper[7518]: I0319 09:23:48.896293 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" Mar 19 09:23:48.896553 master-0 kubenswrapper[7518]: I0319 09:23:48.896520 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb70a30-a8d1-4037-81e6-eb4f0510a234-serving-cert\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: 
\"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" Mar 19 09:23:48.971970 master-0 kubenswrapper[7518]: I0319 09:23:48.971896 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzz6n"] Mar 19 09:23:48.983430 master-0 kubenswrapper[7518]: I0319 09:23:48.974761 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:23:48.993877 master-0 kubenswrapper[7518]: I0319 09:23:48.993839 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:23:48.994005 master-0 kubenswrapper[7518]: I0319 09:23:48.993890 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt22g\" (UniqueName: \"kubernetes.io/projected/9ca444a4-4d78-456f-9656-0c28076ce77e-kube-api-access-kt22g\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:48.994005 master-0 kubenswrapper[7518]: I0319 09:23:48.993923 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: 
\"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:48.994095 master-0 kubenswrapper[7518]: I0319 09:23:48.993996 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rltcj\" (UniqueName: \"kubernetes.io/projected/39bf78ac-304b-4b82-8729-d184657ef3bb-kube-api-access-rltcj\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n" Mar 19 09:23:48.994977 master-0 kubenswrapper[7518]: I0319 09:23:48.994897 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:48.995021 master-0 kubenswrapper[7518]: I0319 09:23:48.994988 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:48.995084 master-0 kubenswrapper[7518]: I0319 09:23:48.995061 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdb6\" (UniqueName: \"kubernetes.io/projected/cd42096c-f18d-4bb5-8a51-8761dc1edb73-kube-api-access-dxdb6\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:48.995159 master-0 kubenswrapper[7518]: I0319 09:23:48.995141 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:48.995203 master-0 kubenswrapper[7518]: I0319 09:23:48.995168 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:48.995247 master-0 kubenswrapper[7518]: I0319 09:23:48.995227 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:48.995280 master-0 kubenswrapper[7518]: I0319 09:23:48.995261 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfw5k\" (UniqueName: \"kubernetes.io/projected/f93b8728-4a33-4ee4-b7c6-cff7d7995953-kube-api-access-kfw5k\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:48.995341 master-0 kubenswrapper[7518]: I0319 09:23:48.995324 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: 
\"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:23:48.995522 master-0 kubenswrapper[7518]: I0319 09:23:48.995445 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-catalog-content\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n" Mar 19 09:23:49.001914 master-0 kubenswrapper[7518]: I0319 09:23:48.999693 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhwd7\" (UniqueName: \"kubernetes.io/projected/141cb120-92da-4d8d-bc29-fc4c433a6336-kube-api-access-fhwd7\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" Mar 19 09:23:49.001914 master-0 kubenswrapper[7518]: I0319 09:23:49.001101 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:23:49.002072 master-0 kubenswrapper[7518]: I0319 09:23:48.997415 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7x89\" (UniqueName: \"kubernetes.io/projected/0cb70a30-a8d1-4037-81e6-eb4f0510a234-kube-api-access-q7x89\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" Mar 19 09:23:49.002211 master-0 kubenswrapper[7518]: I0319 09:23:49.002171 7518 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ws5kr\" (UniqueName: \"kubernetes.io/projected/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-kube-api-access-ws5kr\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:48.999130 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:49.002783 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvp9m\" (UniqueName: \"kubernetes.io/projected/d32541c9-eef6-417c-9f5a-a7392dc70aa0-kube-api-access-fvp9m\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:49.002824 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-utilities\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:49.002851 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: 
\"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:49.002929 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:49.002960 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.004059 master-0 kubenswrapper[7518]: I0319 09:23:49.000782 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:49.022572 master-0 kubenswrapper[7518]: I0319 09:23:49.001494 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz7d6\" (UniqueName: \"kubernetes.io/projected/31742478-0d83-48cf-b38b-02416d95d4a8-kube-api-access-wz7d6\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv" Mar 19 09:23:49.022572 master-0 kubenswrapper[7518]: I0319 09:23:49.020140 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:49.022572 master-0 kubenswrapper[7518]: I0319 09:23:49.020746 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.022572 master-0 kubenswrapper[7518]: I0319 09:23:49.021794 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:49.022572 master-0 kubenswrapper[7518]: I0319 09:23:49.022288 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.023642 master-0 kubenswrapper[7518]: I0319 09:23:49.023073 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:49.023686 
master-0 kubenswrapper[7518]: I0319 09:23:49.023649 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"] Mar 19 09:23:49.027492 master-0 kubenswrapper[7518]: I0319 09:23:49.024832 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:23:49.027492 master-0 kubenswrapper[7518]: I0319 09:23:49.026843 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt22g\" (UniqueName: \"kubernetes.io/projected/9ca444a4-4d78-456f-9656-0c28076ce77e-kube-api-access-kt22g\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.027492 master-0 kubenswrapper[7518]: I0319 09:23:49.027189 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:49.027730 master-0 kubenswrapper[7518]: I0319 09:23:49.027685 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:23:49.035577 master-0 kubenswrapper[7518]: I0319 09:23:49.032892 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: 
\"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.035577 master-0 kubenswrapper[7518]: I0319 09:23:49.033819 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:49.035909 master-0 kubenswrapper[7518]: I0319 09:23:49.035752 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"] Mar 19 09:23:49.038800 master-0 kubenswrapper[7518]: I0319 09:23:49.038619 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-jtdpn" Mar 19 09:23:49.039939 master-0 kubenswrapper[7518]: I0319 09:23:49.039056 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:23:49.045542 master-0 kubenswrapper[7518]: I0319 09:23:49.043187 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:23:49.060527 master-0 kubenswrapper[7518]: I0319 09:23:49.057093 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:49.068591 master-0 kubenswrapper[7518]: I0319 09:23:49.066035 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfw5k\" (UniqueName: \"kubernetes.io/projected/f93b8728-4a33-4ee4-b7c6-cff7d7995953-kube-api-access-kfw5k\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:23:49.068591 master-0 kubenswrapper[7518]: I0319 09:23:49.066870 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvp9m\" (UniqueName: \"kubernetes.io/projected/d32541c9-eef6-417c-9f5a-a7392dc70aa0-kube-api-access-fvp9m\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:23:49.075187 master-0 kubenswrapper[7518]: I0319 09:23:49.075136 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdb6\" (UniqueName: \"kubernetes.io/projected/cd42096c-f18d-4bb5-8a51-8761dc1edb73-kube-api-access-dxdb6\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:23:49.113517 master-0 kubenswrapper[7518]: I0319 09:23:49.110694 7518 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-catalog-content\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.113517 master-0 kubenswrapper[7518]: I0319 09:23:49.111437 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-catalog-content\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.113517 master-0 kubenswrapper[7518]: I0319 09:23:49.111527 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-utilities\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.113517 master-0 kubenswrapper[7518]: I0319 09:23:49.111626 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rltcj\" (UniqueName: \"kubernetes.io/projected/39bf78ac-304b-4b82-8729-d184657ef3bb-kube-api-access-rltcj\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.113517 master-0 kubenswrapper[7518]: I0319 09:23:49.112691 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-utilities\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.118546 master-0 kubenswrapper[7518]: I0319 09:23:49.114967 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:23:49.138334 master-0 kubenswrapper[7518]: I0319 09:23:49.138270 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:23:49.149333 master-0 kubenswrapper[7518]: I0319 09:23:49.148564 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:23:49.179859 master-0 kubenswrapper[7518]: I0319 09:23:49.177828 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:23:49.190397 master-0 kubenswrapper[7518]: I0319 09:23:49.190351 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rltcj\" (UniqueName: \"kubernetes.io/projected/39bf78ac-304b-4b82-8729-d184657ef3bb-kube-api-access-rltcj\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.214871 master-0 kubenswrapper[7518]: I0319 09:23:49.213419 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.214871 master-0 kubenswrapper[7518]: I0319 09:23:49.213600 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f0d16aa2-494d-4a65-880d-3d87219178b5-tmpfs\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.214871 master-0 kubenswrapper[7518]: I0319 09:23:49.213651 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.214871 master-0 kubenswrapper[7518]: I0319 09:23:49.213717 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsdjh\" (UniqueName: \"kubernetes.io/projected/f0d16aa2-494d-4a65-880d-3d87219178b5-kube-api-access-fsdjh\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.226734 master-0 kubenswrapper[7518]: I0319 09:23:49.226578 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:23:49.268375 master-0 kubenswrapper[7518]: I0319 09:23:49.267936 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:23:49.287358 master-0 kubenswrapper[7518]: I0319 09:23:49.287305 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:23:49.295166 master-0 kubenswrapper[7518]: I0319 09:23:49.290927 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:23:49.297523 master-0 kubenswrapper[7518]: I0319 09:23:49.297117 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:23:49.322744 master-0 kubenswrapper[7518]: I0319 09:23:49.317357 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f0d16aa2-494d-4a65-880d-3d87219178b5-tmpfs\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.322744 master-0 kubenswrapper[7518]: I0319 09:23:49.317460 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.322744 master-0 kubenswrapper[7518]: I0319 09:23:49.317506 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsdjh\" (UniqueName: \"kubernetes.io/projected/f0d16aa2-494d-4a65-880d-3d87219178b5-kube-api-access-fsdjh\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.322744 master-0 kubenswrapper[7518]: I0319 09:23:49.317595 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.322744 master-0 kubenswrapper[7518]: I0319 09:23:49.318792 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:23:49.322744 master-0 kubenswrapper[7518]: I0319 09:23:49.319562 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f0d16aa2-494d-4a65-880d-3d87219178b5-tmpfs\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.323363 master-0 kubenswrapper[7518]: I0319 09:23:49.323329 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.334298 master-0 kubenswrapper[7518]: I0319 09:23:49.334205 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.381039 master-0 kubenswrapper[7518]: I0319 09:23:49.380722 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsdjh\" (UniqueName: \"kubernetes.io/projected/f0d16aa2-494d-4a65-880d-3d87219178b5-kube-api-access-fsdjh\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.437285 master-0 kubenswrapper[7518]: I0319 09:23:49.436049 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" event={"ID":"ce38ec35-8f00-4060-a620-1759a6bbef66","Type":"ContainerStarted","Data":"633fccc65fe5856fecc01dbcc7e58f5190eed4eb98e5e73385a0e9bcc0746e0e"}
Mar 19 09:23:49.452420 master-0 kubenswrapper[7518]: I0319 09:23:49.452222 7518 generic.go:334] "Generic (PLEG): container finished" podID="dd69fc33-59d4-4538-b4ec-e2d08ac11f72" containerID="8fa2aedcd94c8a914c06f3267aec5df548ae61bbfade5d0ba8f849928a4839e1" exitCode=0
Mar 19 09:23:49.452420 master-0 kubenswrapper[7518]: I0319 09:23:49.452311 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx45" event={"ID":"dd69fc33-59d4-4538-b4ec-e2d08ac11f72","Type":"ContainerDied","Data":"8fa2aedcd94c8a914c06f3267aec5df548ae61bbfade5d0ba8f849928a4839e1"}
Mar 19 09:23:49.452420 master-0 kubenswrapper[7518]: I0319 09:23:49.452342 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx45" event={"ID":"dd69fc33-59d4-4538-b4ec-e2d08ac11f72","Type":"ContainerStarted","Data":"e4c8b86557bfc322b9f1b1feea17aaefaf34263c41685f5164a347ec08c589e8"}
Mar 19 09:23:49.458057 master-0 kubenswrapper[7518]: I0319 09:23:49.458022 7518 generic.go:334] "Generic (PLEG): container finished" podID="89b0e82c-1cd1-45aa-9cab-2d11320a1ff7" containerID="cf86a9f840243b51077e44de7146e420b0ec2bfabf64c651e8c74a472013cdb5" exitCode=0
Mar 19 09:23:49.458891 master-0 kubenswrapper[7518]: I0319 09:23:49.458773 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqngb" event={"ID":"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7","Type":"ContainerDied","Data":"cf86a9f840243b51077e44de7146e420b0ec2bfabf64c651e8c74a472013cdb5"}
Mar 19 09:23:49.458891 master-0 kubenswrapper[7518]: I0319 09:23:49.458808 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqngb" event={"ID":"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7","Type":"ContainerStarted","Data":"6b70c6219cee771d6e858549f53b5dbf8004794c49061a1d0481404af45e4772"}
Mar 19 09:23:49.638863 master-0 kubenswrapper[7518]: I0319 09:23:49.638710 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:49.682201 master-0 kubenswrapper[7518]: I0319 09:23:49.682152 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"]
Mar 19 09:23:49.682569 master-0 kubenswrapper[7518]: W0319 09:23:49.682507 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca444a4_4d78_456f_9656_0c28076ce77e.slice/crio-1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe WatchSource:0}: Error finding container 1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe: Status 404 returned error can't find the container with id 1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe
Mar 19 09:23:49.841267 master-0 kubenswrapper[7518]: I0319 09:23:49.841217 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"]
Mar 19 09:23:49.849101 master-0 kubenswrapper[7518]: W0319 09:23:49.849058 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a16f6f_437c_4da5_a797_287e5e1ddbd4.slice/crio-60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c WatchSource:0}: Error finding container 60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c: Status 404 returned error can't find the container with id 60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c
Mar 19 09:23:49.917003 master-0 kubenswrapper[7518]: W0319 09:23:49.916218 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd42096c_f18d_4bb5_8a51_8761dc1edb73.slice/crio-140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25 WatchSource:0}: Error finding container 140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25: Status 404 returned error can't find the container with id 140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25
Mar 19 09:23:49.917274 master-0 kubenswrapper[7518]: I0319 09:23:49.917048 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"]
Mar 19 09:23:49.919334 master-0 kubenswrapper[7518]: W0319 09:23:49.918768 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd32541c9_eef6_417c_9f5a_a7392dc70aa0.slice/crio-ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc WatchSource:0}: Error finding container ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc: Status 404 returned error can't find the container with id ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc
Mar 19 09:23:49.955974 master-0 kubenswrapper[7518]: I0319 09:23:49.948173 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"]
Mar 19 09:23:49.958810 master-0 kubenswrapper[7518]: I0319 09:23:49.958624 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"]
Mar 19 09:23:49.970296 master-0 kubenswrapper[7518]: I0319 09:23:49.968514 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-wshz8"]
Mar 19 09:23:49.997405 master-0 kubenswrapper[7518]: I0319 09:23:49.996818 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zpvpd"]
Mar 19 09:23:50.008425 master-0 kubenswrapper[7518]: W0319 09:23:50.008390 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93b8728_4a33_4ee4_b7c6_cff7d7995953.slice/crio-0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600 WatchSource:0}: Error finding container 0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600: Status 404 returned error can't find the container with id 0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600
Mar 19 09:23:50.011844 master-0 kubenswrapper[7518]: I0319 09:23:50.010239 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpvpd"]
Mar 19 09:23:50.011844 master-0 kubenswrapper[7518]: I0319 09:23:50.010494 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.024178 master-0 kubenswrapper[7518]: I0319 09:23:50.014750 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-t6dfg"
Mar 19 09:23:50.116629 master-0 kubenswrapper[7518]: I0319 09:23:50.116579 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"]
Mar 19 09:23:50.129750 master-0 kubenswrapper[7518]: I0319 09:23:50.129702 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-utilities\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.129885 master-0 kubenswrapper[7518]: I0319 09:23:50.129799 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-catalog-content\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.129885 master-0 kubenswrapper[7518]: I0319 09:23:50.129828 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxfs\" (UniqueName: \"kubernetes.io/projected/f1943401-a75b-4e45-8c65-3cc36018d8c4-kube-api-access-8cxfs\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.171343 master-0 kubenswrapper[7518]: I0319 09:23:50.168651 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzz6n"]
Mar 19 09:23:50.236107 master-0 kubenswrapper[7518]: I0319 09:23:50.236058 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-catalog-content\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.236407 master-0 kubenswrapper[7518]: I0319 09:23:50.236132 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxfs\" (UniqueName: \"kubernetes.io/projected/f1943401-a75b-4e45-8c65-3cc36018d8c4-kube-api-access-8cxfs\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.236407 master-0 kubenswrapper[7518]: I0319 09:23:50.236189 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-utilities\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.242938 master-0 kubenswrapper[7518]: I0319 09:23:50.242907 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-catalog-content\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.243072 master-0 kubenswrapper[7518]: I0319 09:23:50.242960 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-utilities\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.280956 master-0 kubenswrapper[7518]: I0319 09:23:50.280892 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"]
Mar 19 09:23:50.284346 master-0 kubenswrapper[7518]: I0319 09:23:50.284315 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxfs\" (UniqueName: \"kubernetes.io/projected/f1943401-a75b-4e45-8c65-3cc36018d8c4-kube-api-access-8cxfs\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.400900 master-0 kubenswrapper[7518]: I0319 09:23:50.400797 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"]
Mar 19 09:23:50.462094 master-0 kubenswrapper[7518]: W0319 09:23:50.462040 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d16aa2_494d_4a65_880d_3d87219178b5.slice/crio-9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209 WatchSource:0}: Error finding container 9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209: Status 404 returned error can't find the container with id 9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209
Mar 19 09:23:50.474903 master-0 kubenswrapper[7518]: I0319 09:23:50.474797 7518 generic.go:334] "Generic (PLEG): container finished" podID="39bf78ac-304b-4b82-8729-d184657ef3bb" containerID="833893fe28da658713401cd9bbaf4ee6973b0a664d7435b398ca89d99977b122" exitCode=0
Mar 19 09:23:50.474994 master-0 kubenswrapper[7518]: I0319 09:23:50.474897 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzz6n" event={"ID":"39bf78ac-304b-4b82-8729-d184657ef3bb","Type":"ContainerDied","Data":"833893fe28da658713401cd9bbaf4ee6973b0a664d7435b398ca89d99977b122"}
Mar 19 09:23:50.474994 master-0 kubenswrapper[7518]: I0319 09:23:50.474930 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzz6n" event={"ID":"39bf78ac-304b-4b82-8729-d184657ef3bb","Type":"ContainerStarted","Data":"91645cb3e01b7383bf3c741eaf023e22432d4ae51de307f2749e304f203b0c13"}
Mar 19 09:23:50.476268 master-0 kubenswrapper[7518]: I0319 09:23:50.476120 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerStarted","Data":"5ae9a109908423db1f7d35a526931ed2af44da77833edc3112c7f12de82644eb"}
Mar 19 09:23:50.477719 master-0 kubenswrapper[7518]: I0319 09:23:50.477361 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv" event={"ID":"31742478-0d83-48cf-b38b-02416d95d4a8","Type":"ContainerStarted","Data":"e4c44a8a218f4d3a8bf81e0a2e78942dceac9d2b2c4c60ba4ca23a60c107ed3b"}
Mar 19 09:23:50.478740 master-0 kubenswrapper[7518]: I0319 09:23:50.478393 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerStarted","Data":"140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25"}
Mar 19 09:23:50.480041 master-0 kubenswrapper[7518]: I0319 09:23:50.479852 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" event={"ID":"c2a16f6f-437c-4da5-a797-287e5e1ddbd4","Type":"ContainerStarted","Data":"3a5057c4663a21aceca2b02ed0a7abbc37ae2e2713660d933566b28e9aef6d45"}
Mar 19 09:23:50.480041 master-0 kubenswrapper[7518]: I0319 09:23:50.479883 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" event={"ID":"c2a16f6f-437c-4da5-a797-287e5e1ddbd4","Type":"ContainerStarted","Data":"60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c"}
Mar 19 09:23:50.480041 master-0 kubenswrapper[7518]: I0319 09:23:50.480033 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:23:50.482213 master-0 kubenswrapper[7518]: I0319 09:23:50.482171 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" event={"ID":"d32541c9-eef6-417c-9f5a-a7392dc70aa0","Type":"ContainerStarted","Data":"6edb54451814dd1a0b69a728692c547cb530f94076165a3f995d004b3eac6073"}
Mar 19 09:23:50.482213 master-0 kubenswrapper[7518]: I0319 09:23:50.482209 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" event={"ID":"d32541c9-eef6-417c-9f5a-a7392dc70aa0","Type":"ContainerStarted","Data":"ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc"}
Mar 19 09:23:50.484753 master-0 kubenswrapper[7518]: I0319 09:23:50.484616 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" event={"ID":"f93b8728-4a33-4ee4-b7c6-cff7d7995953","Type":"ContainerStarted","Data":"9f1ef2cb225eccd803b02b7cf6059c207b67d11faba6471ed6eec3f7aa0adfc4"}
Mar 19 09:23:50.484753 master-0 kubenswrapper[7518]: I0319 09:23:50.484677 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" event={"ID":"f93b8728-4a33-4ee4-b7c6-cff7d7995953","Type":"ContainerStarted","Data":"0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600"}
Mar 19 09:23:50.486144 master-0 kubenswrapper[7518]: I0319 09:23:50.486123 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" event={"ID":"141cb120-92da-4d8d-bc29-fc4c433a6336","Type":"ContainerStarted","Data":"f64ca30d2cf598d32dbab617a0a172e7aa2a1cb9512109dd3142530e06881cb4"}
Mar 19 09:23:50.488257 master-0 kubenswrapper[7518]: I0319 09:23:50.488204 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" event={"ID":"9ca444a4-4d78-456f-9656-0c28076ce77e","Type":"ContainerStarted","Data":"960f32c6d06fce8fbbe541c2eb240023e700e8f662141847dde4b08561087e00"}
Mar 19 09:23:50.488257 master-0 kubenswrapper[7518]: I0319 09:23:50.488244 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" event={"ID":"9ca444a4-4d78-456f-9656-0c28076ce77e","Type":"ContainerStarted","Data":"244e27730aebd49c60a5ec53fb9cadbda7494bcda85162b5e4911d771296a665"}
Mar 19 09:23:50.488257 master-0 kubenswrapper[7518]: I0319 09:23:50.488257 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" event={"ID":"9ca444a4-4d78-456f-9656-0c28076ce77e","Type":"ContainerStarted","Data":"1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe"}
Mar 19 09:23:50.518858 master-0 kubenswrapper[7518]: I0319 09:23:50.518366 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" podStartSLOduration=2.5183468700000002 podStartE2EDuration="2.51834687s" podCreationTimestamp="2026-03-19 09:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:50.514730935 +0000 UTC m=+248.397314194" watchObservedRunningTime="2026-03-19 09:23:50.51834687 +0000 UTC m=+248.400930149"
Mar 19 09:23:50.961779 master-0 kubenswrapper[7518]: I0319 09:23:50.961641 7518 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpvpd"]
Mar 19 09:23:50.968086 master-0 kubenswrapper[7518]: W0319 09:23:50.968019 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1943401_a75b_4e45_8c65_3cc36018d8c4.slice/crio-1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e WatchSource:0}: Error finding container 1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e: Status 404 returned error can't find the container with id 1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e
Mar 19 09:23:51.498167 master-0 kubenswrapper[7518]: I0319 09:23:51.498115 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" event={"ID":"f0d16aa2-494d-4a65-880d-3d87219178b5","Type":"ContainerStarted","Data":"ea2d0feda466ad5cf7cb97fbb7d3de21f570dd4f8225f53dfbe5225a1c32359b"}
Mar 19 09:23:51.498167 master-0 kubenswrapper[7518]: I0319 09:23:51.498164 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" event={"ID":"f0d16aa2-494d-4a65-880d-3d87219178b5","Type":"ContainerStarted","Data":"9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209"}
Mar 19 09:23:51.498550 master-0 kubenswrapper[7518]: I0319 09:23:51.498474 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:51.500793 master-0 kubenswrapper[7518]: I0319 09:23:51.500758 7518 generic.go:334] "Generic (PLEG): container finished" podID="f1943401-a75b-4e45-8c65-3cc36018d8c4" containerID="ac22f5d33f89532c3f245e5d78a3e4b4931118bf3ea5e137f52cf13514162a71" exitCode=0
Mar 19 09:23:51.500981 master-0 kubenswrapper[7518]: I0319 09:23:51.500937 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpvpd" event={"ID":"f1943401-a75b-4e45-8c65-3cc36018d8c4","Type":"ContainerDied","Data":"ac22f5d33f89532c3f245e5d78a3e4b4931118bf3ea5e137f52cf13514162a71"}
Mar 19 09:23:51.500981 master-0 kubenswrapper[7518]: I0319 09:23:51.500969 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpvpd" event={"ID":"f1943401-a75b-4e45-8c65-3cc36018d8c4","Type":"ContainerStarted","Data":"1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e"}
Mar 19 09:23:51.503721 master-0 kubenswrapper[7518]: I0319 09:23:51.503700 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:23:51.542325 master-0 kubenswrapper[7518]: I0319 09:23:51.542223 7518 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" podStartSLOduration=3.54219863 podStartE2EDuration="3.54219863s" podCreationTimestamp="2026-03-19 09:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:23:51.535058842 +0000 UTC m=+249.417642131" watchObservedRunningTime="2026-03-19 09:23:51.54219863 +0000 UTC m=+249.424781889"
Mar 19 09:23:53.756244 master-0 kubenswrapper[7518]: I0319 09:23:53.756154 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hgc52"]
Mar 19 09:23:53.757392 master-0 kubenswrapper[7518]: I0319 09:23:53.757371 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:53.759331 master-0 kubenswrapper[7518]: I0319 09:23:53.759281 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-j66zv"
Mar 19 09:23:53.759331 master-0 kubenswrapper[7518]: I0319 09:23:53.759311 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 09:23:53.919906 master-0 kubenswrapper[7518]: I0319 09:23:53.919849 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxtcq\" (UniqueName: \"kubernetes.io/projected/467c2f01-2c23-41e2-acb9-08a84061fefc-kube-api-access-mxtcq\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:53.920206 master-0 kubenswrapper[7518]: I0319 09:23:53.919918 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/467c2f01-2c23-41e2-acb9-08a84061fefc-rootfs\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:53.920206 master-0 kubenswrapper[7518]: I0319 09:23:53.919996 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:53.920206 master-0 kubenswrapper[7518]: I0319 09:23:53.920060 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.022058 master-0 kubenswrapper[7518]: I0319 09:23:54.021941 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtcq\" (UniqueName: \"kubernetes.io/projected/467c2f01-2c23-41e2-acb9-08a84061fefc-kube-api-access-mxtcq\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.022058 master-0 kubenswrapper[7518]: I0319 09:23:54.022011 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/467c2f01-2c23-41e2-acb9-08a84061fefc-rootfs\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.022058 master-0 kubenswrapper[7518]: I0319 09:23:54.022049 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.022329 master-0 kubenswrapper[7518]: I0319 09:23:54.022082 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.022329 master-0 kubenswrapper[7518]: I0319 09:23:54.022301 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/467c2f01-2c23-41e2-acb9-08a84061fefc-rootfs\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.023008 master-0 kubenswrapper[7518]: I0319 09:23:54.022976 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.035747 master-0 kubenswrapper[7518]: I0319 09:23:54.030433 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.046813 master-0 kubenswrapper[7518]: I0319 09:23:54.046758 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxtcq\" (UniqueName: \"kubernetes.io/projected/467c2f01-2c23-41e2-acb9-08a84061fefc-kube-api-access-mxtcq\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:54.078090 master-0 kubenswrapper[7518]: I0319 09:23:54.075572 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:23:59.845672 master-0 kubenswrapper[7518]: I0319 09:23:59.845029 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"]
Mar 19 09:23:59.846650 master-0 kubenswrapper[7518]: I0319 09:23:59.846317 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="machine-approver-controller" containerID="cri-o://8ea09204714320987ee497184ecc0341387802c177649f71cb1059afb0240745" gracePeriod=30
Mar 19 09:23:59.846720 master-0 kubenswrapper[7518]: I0319 09:23:59.846694 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="kube-rbac-proxy" containerID="cri-o://1cf12cc7445333b9b5a115e74969e46b1dc4dba3d931d8d5860393a5791b239a" gracePeriod=30
Mar 19 09:24:03.579777 master-0 kubenswrapper[7518]: I0319 09:24:03.579714 7518 generic.go:334] "Generic (PLEG): container finished" podID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerID="8ea09204714320987ee497184ecc0341387802c177649f71cb1059afb0240745" exitCode=0
Mar 19 09:24:03.579777 master-0 kubenswrapper[7518]: I0319 09:24:03.579758 7518 generic.go:334] "Generic (PLEG): container finished" podID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerID="1cf12cc7445333b9b5a115e74969e46b1dc4dba3d931d8d5860393a5791b239a" exitCode=0
Mar 19 09:24:03.580281 master-0 kubenswrapper[7518]: I0319 09:24:03.579794 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"
event={"ID":"de509e3d-5e9c-47be-bce2-adc4f435aea8","Type":"ContainerDied","Data":"8ea09204714320987ee497184ecc0341387802c177649f71cb1059afb0240745"} Mar 19 09:24:03.580281 master-0 kubenswrapper[7518]: I0319 09:24:03.579846 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" event={"ID":"de509e3d-5e9c-47be-bce2-adc4f435aea8","Type":"ContainerDied","Data":"1cf12cc7445333b9b5a115e74969e46b1dc4dba3d931d8d5860393a5791b239a"} Mar 19 09:24:16.843374 master-0 kubenswrapper[7518]: I0319 09:24:16.843331 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"] Mar 19 09:24:19.323766 master-0 kubenswrapper[7518]: I0319 09:24:19.323722 7518 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:24:32.195755 master-0 kubenswrapper[7518]: I0319 09:24:32.195699 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" Mar 19 09:24:32.279965 master-0 kubenswrapper[7518]: I0319 09:24:32.279895 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-config\") pod \"de509e3d-5e9c-47be-bce2-adc4f435aea8\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " Mar 19 09:24:32.279965 master-0 kubenswrapper[7518]: I0319 09:24:32.279941 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-auth-proxy-config\") pod \"de509e3d-5e9c-47be-bce2-adc4f435aea8\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " Mar 19 09:24:32.280221 master-0 kubenswrapper[7518]: I0319 09:24:32.279994 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/de509e3d-5e9c-47be-bce2-adc4f435aea8-machine-approver-tls\") pod \"de509e3d-5e9c-47be-bce2-adc4f435aea8\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " Mar 19 09:24:32.280221 master-0 kubenswrapper[7518]: I0319 09:24:32.280064 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggm26\" (UniqueName: \"kubernetes.io/projected/de509e3d-5e9c-47be-bce2-adc4f435aea8-kube-api-access-ggm26\") pod \"de509e3d-5e9c-47be-bce2-adc4f435aea8\" (UID: \"de509e3d-5e9c-47be-bce2-adc4f435aea8\") " Mar 19 09:24:32.280353 master-0 kubenswrapper[7518]: I0319 09:24:32.280318 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-config" (OuterVolumeSpecName: "config") pod "de509e3d-5e9c-47be-bce2-adc4f435aea8" (UID: "de509e3d-5e9c-47be-bce2-adc4f435aea8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:32.280844 master-0 kubenswrapper[7518]: I0319 09:24:32.280783 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "de509e3d-5e9c-47be-bce2-adc4f435aea8" (UID: "de509e3d-5e9c-47be-bce2-adc4f435aea8"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:32.282871 master-0 kubenswrapper[7518]: I0319 09:24:32.282831 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de509e3d-5e9c-47be-bce2-adc4f435aea8-kube-api-access-ggm26" (OuterVolumeSpecName: "kube-api-access-ggm26") pod "de509e3d-5e9c-47be-bce2-adc4f435aea8" (UID: "de509e3d-5e9c-47be-bce2-adc4f435aea8"). InnerVolumeSpecName "kube-api-access-ggm26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:32.283740 master-0 kubenswrapper[7518]: I0319 09:24:32.283607 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de509e3d-5e9c-47be-bce2-adc4f435aea8-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "de509e3d-5e9c-47be-bce2-adc4f435aea8" (UID: "de509e3d-5e9c-47be-bce2-adc4f435aea8"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:32.382438 master-0 kubenswrapper[7518]: I0319 09:24:32.382350 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:32.382438 master-0 kubenswrapper[7518]: I0319 09:24:32.382410 7518 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de509e3d-5e9c-47be-bce2-adc4f435aea8-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:32.382438 master-0 kubenswrapper[7518]: I0319 09:24:32.382432 7518 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/de509e3d-5e9c-47be-bce2-adc4f435aea8-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:32.382438 master-0 kubenswrapper[7518]: I0319 09:24:32.382449 7518 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggm26\" (UniqueName: \"kubernetes.io/projected/de509e3d-5e9c-47be-bce2-adc4f435aea8-kube-api-access-ggm26\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:32.769546 master-0 kubenswrapper[7518]: I0319 09:24:32.769357 7518 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" event={"ID":"de509e3d-5e9c-47be-bce2-adc4f435aea8","Type":"ContainerDied","Data":"b6f21e047d7fe1c17012e8b0e2eccf0c0df41f1dd7af47ee16ae785f35047af4"} Mar 19 09:24:32.769546 master-0 kubenswrapper[7518]: I0319 09:24:32.769550 7518 scope.go:117] "RemoveContainer" containerID="8ea09204714320987ee497184ecc0341387802c177649f71cb1059afb0240745" Mar 19 09:24:32.769546 master-0 kubenswrapper[7518]: I0319 09:24:32.769549 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh" Mar 19 09:24:33.306727 master-0 kubenswrapper[7518]: I0319 09:24:33.306644 7518 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"] Mar 19 09:24:33.310093 master-0 kubenswrapper[7518]: I0319 09:24:33.310038 7518 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-qkbqh"] Mar 19 09:24:33.381702 master-0 kubenswrapper[7518]: I0319 09:24:33.381637 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5"] Mar 19 09:24:33.384242 master-0 kubenswrapper[7518]: E0319 09:24:33.384171 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="kube-rbac-proxy" Mar 19 09:24:33.384242 master-0 kubenswrapper[7518]: I0319 09:24:33.384206 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="kube-rbac-proxy" Mar 19 09:24:33.384242 master-0 kubenswrapper[7518]: E0319 09:24:33.384223 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="machine-approver-controller" Mar 19 09:24:33.384242 master-0 kubenswrapper[7518]: I0319 09:24:33.384232 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="machine-approver-controller" Mar 19 09:24:33.384840 master-0 kubenswrapper[7518]: I0319 09:24:33.384394 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" containerName="machine-approver-controller" Mar 19 09:24:33.384840 master-0 kubenswrapper[7518]: I0319 09:24:33.384425 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" 
containerName="kube-rbac-proxy" Mar 19 09:24:33.386951 master-0 kubenswrapper[7518]: I0319 09:24:33.386918 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.392709 master-0 kubenswrapper[7518]: I0319 09:24:33.392666 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 09:24:33.393762 master-0 kubenswrapper[7518]: I0319 09:24:33.393720 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 09:24:33.394037 master-0 kubenswrapper[7518]: I0319 09:24:33.393830 7518 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-l9t78" Mar 19 09:24:33.394037 master-0 kubenswrapper[7518]: I0319 09:24:33.393765 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 09:24:33.395544 master-0 kubenswrapper[7518]: I0319 09:24:33.395502 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 09:24:33.396406 master-0 kubenswrapper[7518]: I0319 09:24:33.396371 7518 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 09:24:33.412572 master-0 kubenswrapper[7518]: I0319 09:24:33.410680 7518 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:24:33.412572 master-0 kubenswrapper[7518]: I0319 09:24:33.411438 7518 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:24:33.412572 master-0 kubenswrapper[7518]: I0319 09:24:33.411691 7518 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" containerID="cri-o://7716e20f21898d48a97cdc11ca530decd4b56cabb9557337c593d6dc0a3abe47" gracePeriod=15 Mar 19 09:24:33.412572 master-0 kubenswrapper[7518]: I0319 09:24:33.411854 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.412572 master-0 kubenswrapper[7518]: I0319 09:24:33.411919 7518 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://157ec68d28f9ad49e7460cf4325702e32a61a87e98a342a6b3f00e830966c9b0" gracePeriod=15 Mar 19 09:24:33.412572 master-0 kubenswrapper[7518]: I0319 09:24:33.412399 7518 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:24:33.413266 master-0 kubenswrapper[7518]: E0319 09:24:33.412777 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 09:24:33.413266 master-0 kubenswrapper[7518]: I0319 09:24:33.412795 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 09:24:33.413266 master-0 kubenswrapper[7518]: E0319 09:24:33.412810 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 09:24:33.413266 master-0 kubenswrapper[7518]: I0319 09:24:33.412817 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 09:24:33.413266 master-0 kubenswrapper[7518]: E0319 
09:24:33.412829 7518 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 09:24:33.413266 master-0 kubenswrapper[7518]: I0319 09:24:33.412837 7518 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 09:24:33.413505 master-0 kubenswrapper[7518]: I0319 09:24:33.413063 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" Mar 19 09:24:33.413505 master-0 kubenswrapper[7518]: I0319 09:24:33.413403 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" Mar 19 09:24:33.413505 master-0 kubenswrapper[7518]: I0319 09:24:33.413414 7518 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup" Mar 19 09:24:33.420143 master-0 kubenswrapper[7518]: I0319 09:24:33.419563 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.507794 master-0 kubenswrapper[7518]: I0319 09:24:33.507673 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.507794 master-0 kubenswrapper[7518]: I0319 09:24:33.507740 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.507794 master-0 kubenswrapper[7518]: I0319 09:24:33.507791 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.507821 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.507855 7518 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.507887 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.507915 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.507937 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.507974 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.508002 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.508033 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.508100 master-0 kubenswrapper[7518]: I0319 09:24:33.508058 7518 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.514299 master-0 kubenswrapper[7518]: I0319 09:24:33.514259 7518 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:24:33.543583 master-0 kubenswrapper[7518]: E0319 09:24:33.543529 7518 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.609199 master-0 kubenswrapper[7518]: I0319 
09:24:33.609058 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.609199 master-0 kubenswrapper[7518]: I0319 09:24:33.609151 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.609199 master-0 kubenswrapper[7518]: I0319 09:24:33.609178 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609215 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609249 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609282 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609302 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609322 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609348 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609368 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609386 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.609449 master-0 kubenswrapper[7518]: I0319 09:24:33.609415 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.609749 master-0 kubenswrapper[7518]: I0319 09:24:33.609511 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.609749 master-0 kubenswrapper[7518]: I0319 09:24:33.609581 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.609749 master-0 kubenswrapper[7518]: I0319 09:24:33.609607 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.610496 master-0 kubenswrapper[7518]: I0319 09:24:33.610455 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.611113 master-0 kubenswrapper[7518]: I0319 09:24:33.611073 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.611165 master-0 kubenswrapper[7518]: I0319 09:24:33.611138 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.611214 master-0 kubenswrapper[7518]: I0319 09:24:33.611197 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.611246 master-0 kubenswrapper[7518]: I0319 09:24:33.611232 7518 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:33.611887 master-0 kubenswrapper[7518]: I0319 09:24:33.611842 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.612301 master-0 kubenswrapper[7518]: E0319 09:24:33.612180 7518 projected.go:194] Error preparing data for projected volume kube-api-access-h4wht for pod openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:33.612301 master-0 kubenswrapper[7518]: E0319 09:24:33.612287 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:34.112263536 +0000 UTC m=+291.994846795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h4wht" (UniqueName: "kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:33.612685 master-0 kubenswrapper[7518]: I0319 09:24:33.612657 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.612945 master-0 kubenswrapper[7518]: E0319 09:24:33.612812 7518 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-approver-5c6485487f-cscz5.189e33d0735cff62 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-5c6485487f-cscz5,UID:dea35f60-33be-4ccc-b985-952eac3a85c0,APIVersion:v1,ResourceVersion:10561,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-h4wht\" : failed to fetch token: Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token\": dial tcp 192.168.32.10:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:24:33.612242786 +0000 
UTC m=+291.494826045,LastTimestamp:2026-03-19 09:24:33.612242786 +0000 UTC m=+291.494826045,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:24:33.614169 master-0 kubenswrapper[7518]: I0319 09:24:33.614139 7518 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:33.806032 master-0 kubenswrapper[7518]: I0319 09:24:33.805949 7518 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:33.844822 master-0 kubenswrapper[7518]: I0319 09:24:33.844729 7518 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:34.117564 master-0 kubenswrapper[7518]: I0319 09:24:34.117086 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:34.118870 master-0 kubenswrapper[7518]: E0319 09:24:34.118840 7518 projected.go:194] Error preparing data for projected volume kube-api-access-h4wht for pod openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:34.118958 master-0 kubenswrapper[7518]: E0319 09:24:34.118945 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:35.118915873 +0000 UTC m=+293.001499142 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h4wht" (UniqueName: "kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:34.322494 master-0 kubenswrapper[7518]: I0319 09:24:34.322384 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de509e3d-5e9c-47be-bce2-adc4f435aea8" path="/var/lib/kubelet/pods/de509e3d-5e9c-47be-bce2-adc4f435aea8/volumes" Mar 19 09:24:35.130925 master-0 kubenswrapper[7518]: I0319 09:24:35.130800 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:35.132242 master-0 kubenswrapper[7518]: E0319 09:24:35.132174 7518 projected.go:194] Error preparing data for projected volume kube-api-access-h4wht for pod openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:35.132377 master-0 kubenswrapper[7518]: E0319 09:24:35.132266 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:24:37.132244986 +0000 UTC m=+295.014828245 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h4wht" (UniqueName: "kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:35.804502 master-0 kubenswrapper[7518]: E0319 09:24:35.804400 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:35.805225 master-0 kubenswrapper[7518]: E0319 09:24:35.805187 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:35.805878 master-0 kubenswrapper[7518]: E0319 09:24:35.805834 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:35.806378 master-0 kubenswrapper[7518]: E0319 09:24:35.806346 7518 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:35.806815 master-0 kubenswrapper[7518]: E0319 09:24:35.806772 7518 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:35.806815 master-0 kubenswrapper[7518]: I0319 09:24:35.806801 7518 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:24:35.807431 master-0 kubenswrapper[7518]: E0319 09:24:35.807381 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 09:24:36.009002 master-0 kubenswrapper[7518]: E0319 09:24:36.008930 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:24:36.410564 master-0 kubenswrapper[7518]: E0319 09:24:36.410392 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:24:37.211374 master-0 kubenswrapper[7518]: E0319 09:24:37.211303 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 09:24:37.218497 master-0 kubenswrapper[7518]: I0319 09:24:37.218420 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wht\" 
(UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:37.219387 master-0 kubenswrapper[7518]: E0319 09:24:37.219357 7518 projected.go:194] Error preparing data for projected volume kube-api-access-h4wht for pod openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:37.219486 master-0 kubenswrapper[7518]: E0319 09:24:37.219435 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:41.219409503 +0000 UTC m=+299.101992762 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h4wht" (UniqueName: "kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:38.813150 master-0 kubenswrapper[7518]: E0319 09:24:38.813088 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 09:24:39.587600 master-0 kubenswrapper[7518]: E0319 09:24:39.587348 7518 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-approver-5c6485487f-cscz5.189e33d0735cff62 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-5c6485487f-cscz5,UID:dea35f60-33be-4ccc-b985-952eac3a85c0,APIVersion:v1,ResourceVersion:10561,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-h4wht\" : failed to fetch token: Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token\": dial tcp 192.168.32.10:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:24:33.612242786 +0000 UTC m=+291.494826045,LastTimestamp:2026-03-19 
09:24:33.612242786 +0000 UTC m=+291.494826045,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:24:41.294309 master-0 kubenswrapper[7518]: I0319 09:24:41.294246 7518 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:24:41.295135 master-0 kubenswrapper[7518]: E0319 09:24:41.295100 7518 projected.go:194] Error preparing data for projected volume kube-api-access-h4wht for pod openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:41.295207 master-0 kubenswrapper[7518]: E0319 09:24:41.295167 7518 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:49.295149053 +0000 UTC m=+307.177732312 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h4wht" (UniqueName: "kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:42.015222 master-0 kubenswrapper[7518]: E0319 09:24:42.015152 7518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 19 09:24:42.111114 master-0 kubenswrapper[7518]: I0319 09:24:42.111065 7518 scope.go:117] "RemoveContainer" containerID="1cf12cc7445333b9b5a115e74969e46b1dc4dba3d931d8d5860393a5791b239a" Mar 19 09:24:42.209913 master-0 kubenswrapper[7518]: W0319 09:24:42.209848 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467c2f01_2c23_41e2_acb9_08a84061fefc.slice/crio-4316d2211dbd015d08696560a401855badad02d5162bd18e5f9b36a4aa80b6a7 WatchSource:0}: Error finding container 4316d2211dbd015d08696560a401855badad02d5162bd18e5f9b36a4aa80b6a7: Status 404 returned error can't find the container with id 4316d2211dbd015d08696560a401855badad02d5162bd18e5f9b36a4aa80b6a7 Mar 19 09:24:42.224704 master-0 kubenswrapper[7518]: I0319 09:24:42.224648 7518 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:24:42.225422 master-0 kubenswrapper[7518]: I0319 09:24:42.225379 7518 status_manager.go:851] "Failed to get status for pod" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:42.260032 master-0 kubenswrapper[7518]: W0319 09:24:42.259994 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1511182fa3564db9f50c25912cc22f.slice/crio-44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb WatchSource:0}: Error finding container 44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb: Status 404 returned error can't find the container with id 44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb Mar 19 09:24:42.262353 master-0 kubenswrapper[7518]: W0319 09:24:42.262323 7518 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4801b7b4c9bb4aca19f4e1af1002ed5d.slice/crio-ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d WatchSource:0}: Error finding container ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d: Status 404 returned error can't find the container with id ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d Mar 19 09:24:42.308445 master-0 kubenswrapper[7518]: I0319 09:24:42.308411 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:24:42.317959 master-0 
kubenswrapper[7518]: I0319 09:24:42.308462 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308516 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308548 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308627 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308655 7518 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308765 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets" (OuterVolumeSpecName: "secrets") pod 
"49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308800 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308930 7518 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308940 7518 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308957 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config" (OuterVolumeSpecName: "config") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308973 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.308990 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:42.317959 master-0 kubenswrapper[7518]: I0319 09:24:42.309246 7518 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs" (OuterVolumeSpecName: "logs") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:24:42.320834 master-0 kubenswrapper[7518]: I0319 09:24:42.320788 7518 status_manager.go:851] "Failed to get status for pod" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:24:42.322326 master-0 kubenswrapper[7518]: I0319 09:24:42.322292 7518 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fac1b46a11e49501805e891baae4a9" path="/var/lib/kubelet/pods/49fac1b46a11e49501805e891baae4a9/volumes" Mar 19 09:24:42.322880 master-0 kubenswrapper[7518]: I0319 09:24:42.322860 7518 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 09:24:42.324186 master-0 kubenswrapper[7518]: E0319 09:24:42.324156 7518 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/bootstrap-kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 09:24:42.405401 master-0 kubenswrapper[7518]: I0319 09:24:42.404731 7518 scope.go:117] "RemoveContainer" containerID="7716e20f21898d48a97cdc11ca530decd4b56cabb9557337c593d6dc0a3abe47" Mar 19 09:24:42.410133 master-0 kubenswrapper[7518]: I0319 09:24:42.410097 7518 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:42.410133 master-0 kubenswrapper[7518]: I0319 09:24:42.410125 7518 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:42.410254 master-0 kubenswrapper[7518]: I0319 09:24:42.410140 7518 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:42.410254 master-0 kubenswrapper[7518]: I0319 09:24:42.410153 7518 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:24:42.462116 master-0 kubenswrapper[7518]: I0319 09:24:42.462075 7518 scope.go:117] "RemoveContainer" containerID="a84c1f34c626f1387c9440e1656352bf22e178dc307b15faa17e2d14af155731" Mar 19 09:24:42.464537 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 19 09:24:42.514069 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 19 09:24:42.514342 master-0 systemd[1]: Stopped Kubernetes Kubelet. 
Mar 19 09:24:42.519790 master-0 systemd[1]: kubelet.service: Consumed 33.800s CPU time. Mar 19 09:24:42.565201 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 19 09:24:42.668318 master-0 kubenswrapper[15202]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:24:42.668318 master-0 kubenswrapper[15202]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 09:24:42.668318 master-0 kubenswrapper[15202]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:24:42.668318 master-0 kubenswrapper[15202]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:24:42.668318 master-0 kubenswrapper[15202]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 09:24:42.669002 master-0 kubenswrapper[15202]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 09:24:42.669002 master-0 kubenswrapper[15202]: I0319 09:24:42.668427 15202 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670876 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670895 15202 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670900 15202 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670905 15202 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670910 15202 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670915 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670920 15202 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670926 15202 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670931 15202 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670936 15202 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670941 15202 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670945 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670950 15202 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670955 15202 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670960 15202 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670964 15202 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670969 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670973 15202 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670977 15202 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:24:42.670999 master-0 kubenswrapper[15202]: W0319 09:24:42.670981 15202 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.670986 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.670999 15202 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671004 15202 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671009 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671015 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671020 15202 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671025 15202 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671030 15202 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671035 15202 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671042 15202 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671049 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671054 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671058 15202 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671064 15202 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671069 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671074 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671079 15202 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671083 15202 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:24:42.672041 master-0 kubenswrapper[15202]: W0319 09:24:42.671087 15202 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671091 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671096 15202 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671099 15202 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671103 15202 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671107 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671111 15202 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671116 15202 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671120 15202 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671124 15202 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671129 15202 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671134 15202 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671138 15202 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671142 15202 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671146 15202 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671150 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671154 15202 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671160 15202 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671165 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671170 15202 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:24:42.672868 master-0 kubenswrapper[15202]: W0319 09:24:42.671174 15202 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671180 15202 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671185 15202 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671189 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671193 15202 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671198 15202 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671203 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671207 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671212 15202 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671218 15202 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671223 15202 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671227 15202 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671231 15202 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: W0319 09:24:42.671235 15202 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671343 15202 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671353 15202 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671361 15202 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671367 15202 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671373 15202 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671378 15202 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 09:24:42.673617 master-0 kubenswrapper[15202]: I0319 09:24:42.671385 15202 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671392 15202 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671398 15202 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671404 15202 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671409 15202 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671415 15202 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671421 15202 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671428 15202 flags.go:64] FLAG: --cgroup-root=""
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671434 15202 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671440 15202 flags.go:64] FLAG: --client-ca-file=""
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671445 15202 flags.go:64] FLAG: --cloud-config=""
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671452 15202 flags.go:64] FLAG: --cloud-provider=""
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671457 15202 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671464 15202 flags.go:64] FLAG: --cluster-domain=""
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671472 15202 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671477 15202 flags.go:64] FLAG: --config-dir=""
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671534 15202 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671540 15202 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671551 15202 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671556 15202 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671561 15202 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671566 15202 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671571 15202 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671576 15202 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671581 15202 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 09:24:42.674427 master-0 kubenswrapper[15202]: I0319 09:24:42.671587 15202 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671593 15202 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671630 15202 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671636 15202 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671642 15202 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671647 15202 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671653 15202 flags.go:64] FLAG: --enable-server="true"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671659 15202 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671666 15202 flags.go:64] FLAG: --event-burst="100"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671672 15202 flags.go:64] FLAG: --event-qps="50"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671678 15202 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671683 15202 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671688 15202 flags.go:64] FLAG: --eviction-hard=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671695 15202 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671701 15202 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671707 15202 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671712 15202 flags.go:64] FLAG: --eviction-soft=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671717 15202 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671722 15202 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671728 15202 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671734 15202 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671739 15202 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671744 15202 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671749 15202 flags.go:64] FLAG: --feature-gates=""
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671755 15202 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 09:24:42.675392 master-0 kubenswrapper[15202]: I0319 09:24:42.671760 15202 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671804 15202 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671810 15202 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671815 15202 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671820 15202 flags.go:64] FLAG: --help="false"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671825 15202 flags.go:64] FLAG: --hostname-override=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671832 15202 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671842 15202 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671847 15202 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671852 15202 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671857 15202 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671864 15202 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671869 15202 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671875 15202 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671881 15202 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671886 15202 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671892 15202 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671897 15202 flags.go:64] FLAG: --kube-reserved=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671903 15202 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671908 15202 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671914 15202 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671920 15202 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671925 15202 flags.go:64] FLAG: --lock-file=""
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671930 15202 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671935 15202 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 09:24:42.676515 master-0 kubenswrapper[15202]: I0319 09:24:42.671941 15202 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671949 15202 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671955 15202 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671960 15202 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671965 15202 flags.go:64] FLAG: --logging-format="text"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671970 15202 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671976 15202 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671982 15202 flags.go:64] FLAG: --manifest-url=""
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671988 15202 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.671996 15202 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672001 15202 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672007 15202 flags.go:64] FLAG: --max-pods="110"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672012 15202 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672018 15202 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672027 15202 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672031 15202 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672036 15202 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672042 15202 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672047 15202 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672058 15202 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672064 15202 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672070 15202 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672075 15202 flags.go:64] FLAG: --pod-cidr=""
Mar 19 09:24:42.677603 master-0 kubenswrapper[15202]: I0319 09:24:42.672080 15202 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672089 15202 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672094 15202 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672099 15202 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672105 15202 flags.go:64] FLAG: --port="10250"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672110 15202 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672115 15202 flags.go:64] FLAG: --provider-id=""
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672120 15202 flags.go:64] FLAG: --qos-reserved=""
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672125 15202 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672129 15202 flags.go:64] FLAG: --register-node="true"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672167 15202 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672172 15202 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672181 15202 flags.go:64] FLAG: --registry-burst="10"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672185 15202 flags.go:64] FLAG: --registry-qps="5"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672190 15202 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672195 15202 flags.go:64] FLAG: --reserved-memory=""
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672200 15202 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672205 15202 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672210 15202 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672215 15202 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672219 15202 flags.go:64] FLAG: --runonce="false"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672224 15202 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672230 15202 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672235 15202 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672243 15202 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672249 15202 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 09:24:42.678239 master-0 kubenswrapper[15202]: I0319 09:24:42.672254 15202 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672259 15202 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672264 15202 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672269 15202 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672273 15202 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672278 15202 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672282 15202 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672287 15202 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672292 15202 flags.go:64] FLAG: --system-cgroups=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672298 15202 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672305 15202 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672310 15202 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672315 15202 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672322 15202 flags.go:64] FLAG: --tls-min-version=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672327 15202 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672332 15202 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672336 15202 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672341 15202 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672345 15202 flags.go:64] FLAG: --v="2"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672351 15202 flags.go:64] FLAG: --version="false"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672359 15202 flags.go:64] FLAG: --vmodule=""
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672365 15202 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: I0319 09:24:42.672371 15202 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: W0319 09:24:42.672525 15202 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: W0319 09:24:42.672534 15202 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:24:42.678968 master-0 kubenswrapper[15202]: W0319 09:24:42.672539 15202 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672543 15202 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672547 15202 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672551 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672555 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672563 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672567 15202 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672571 15202 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672575 15202 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672579 15202 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672583 15202 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672587 15202 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672591 15202 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672595 15202 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672599 15202 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672603 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672607 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672611 15202 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672615 15202 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672620 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:24:42.679694 master-0 kubenswrapper[15202]: W0319 09:24:42.672625 15202 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672629 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672633 15202 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672637 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672641 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672645 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672649 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672654 15202 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672659 15202 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672664 15202 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672669 15202 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672673 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672677 15202 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672681 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672687 15202 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672691 15202 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672696 15202 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672703 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672708 15202 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:24:42.680348 master-0 kubenswrapper[15202]: W0319 09:24:42.672714 15202 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672719 15202 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672723 15202 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672729 15202 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672733 15202 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672738 15202 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672743 15202 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672748 15202 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672752 15202 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672756 15202 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672761 15202 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672765 15202 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672769 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672773 15202 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672777 15202 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672781 15202 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672785 15202 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672790 15202 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672795 15202 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672799 15202 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:24:42.681008 master-0 kubenswrapper[15202]: W0319 09:24:42.672803 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672807 15202 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672811 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672815 15202 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672819 15202 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672822 15202 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672826 15202 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672830 15202 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672834 15202 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672839 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.672847 15202 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: I0319 09:24:42.672861 15202 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: I0319 09:24:42.677933 15202 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: I0319 09:24:42.677962 15202 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.678031 15202 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:24:42.681730 master-0 kubenswrapper[15202]: W0319 09:24:42.678039 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678044 15202 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678048 15202 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678052 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678057 15202 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678060 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678064 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678067 15202 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678070 15202 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678074 15202 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678078 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678082 15202 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678085 15202 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678089 15202 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678094 15202 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678098 15202 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678101 15202 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678105 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678108 15202 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678112 15202 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:24:42.682270 master-0 kubenswrapper[15202]: W0319 09:24:42.678117 15202 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678123 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678127 15202 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678131 15202 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678135 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678139 15202 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678143 15202 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678146 15202 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678150 15202 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678155 15202 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678160 15202 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678164 15202 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678167 15202 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678173 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678177 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678181 15202 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678184 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678188 15202 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678192 15202 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678195 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:24:42.683141 master-0 kubenswrapper[15202]: W0319 09:24:42.678198 15202 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678202 15202 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678206 15202 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678209 15202 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678213 15202 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678217 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678222 15202 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678226 15202 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678229 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678233 15202 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678236 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678241 15202 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678244 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678249 15202 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678253 15202 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678257 15202 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678261 15202 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678264 15202 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678268 15202 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678272 15202 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:24:42.683846 master-0 kubenswrapper[15202]: W0319 09:24:42.678276 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678280 15202 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678284 15202 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678289 15202 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678293 15202 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678296 15202 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678300 15202 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678304 15202 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678307 15202 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678311 15202 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678315 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: I0319 09:24:42.678322 15202 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678439 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678447 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678451 15202 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 09:24:42.684687 master-0 kubenswrapper[15202]: W0319 09:24:42.678456 15202 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678460 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678469 15202 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678472 15202 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678489 15202 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678495 15202 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678501 15202 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678506 15202 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678510 15202 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678514 15202 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678518 15202 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678521 15202 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678525 15202 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678529 15202 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678533 15202 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678537 15202 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678540 15202 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678544 15202 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678547 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678552 15202 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 09:24:42.685172 master-0 kubenswrapper[15202]: W0319 09:24:42.678558 15202 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678562 15202 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678566 15202 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678570 15202 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678574 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678577 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678581 15202 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678584 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678588 15202 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678592 15202 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678596 15202 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678601 15202 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678605 15202 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678609 15202 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678613 15202 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678616 15202 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678621 15202 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678626 15202 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678630 15202 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 09:24:42.685847 master-0 kubenswrapper[15202]: W0319 09:24:42.678634 15202 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678637 15202 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678641 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678645 15202 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678648 15202 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678652 15202 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678657 15202 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678661 15202 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678665 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678669 15202 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678673 15202 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678677 15202 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678681 15202 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678685 15202 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678690 15202 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678694 15202 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678699 15202 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678703 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678707 15202 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 09:24:42.686928 master-0 kubenswrapper[15202]: W0319 09:24:42.678711 15202 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678715 15202 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678719 15202 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678723 15202 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678727 15202 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678731 15202 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678735 15202 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678740 15202 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678744 15202 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678748 15202 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: W0319 09:24:42.678751 15202 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: I0319 09:24:42.678757 15202 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: I0319 09:24:42.678936 15202 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: I0319 09:24:42.680449 15202 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: I0319 09:24:42.680584 15202 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 19 09:24:42.687629 master-0 kubenswrapper[15202]: I0319 09:24:42.680779 15202 server.go:997] "Starting client certificate rotation"
Mar 19 09:24:42.688172 master-0 kubenswrapper[15202]: I0319 09:24:42.680788 15202 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 09:24:42.688172 master-0 kubenswrapper[15202]: I0319 09:24:42.681126 15202 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 06:29:53.388197213 +0000 UTC
Mar 19 09:24:42.688172 master-0 kubenswrapper[15202]: I0319 09:24:42.681372 15202 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h5m10.706836002s for next certificate rotation
Mar 19 09:24:42.688172 master-0 kubenswrapper[15202]: I0319 09:24:42.681994 15202 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:24:42.688172 master-0 kubenswrapper[15202]: I0319 09:24:42.683601 15202 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:24:42.691731 master-0 kubenswrapper[15202]: I0319 09:24:42.691643 15202 log.go:25] "Validated CRI v1 runtime API"
Mar 19 09:24:42.698173 master-0 kubenswrapper[15202]: I0319 09:24:42.698133 15202 log.go:25] "Validated CRI v1 image API"
Mar 19 09:24:42.699830 master-0 kubenswrapper[15202]: I0319 09:24:42.699771 15202 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 09:24:42.713619 master-0 kubenswrapper[15202]: I0319 09:24:42.712923 15202 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 a870f5cc-57ed-47cd-b7c0-f85f1fc0e63d:/dev/vda3]
Mar 19 09:24:42.714189 master-0 kubenswrapper[15202]: I0319 09:24:42.712975 15202 fs.go:136] Filesystem partitions:
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56/userdata/shm major:0 minor:141 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600/userdata/shm major:0 minor:809 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e/userdata/shm major:0 minor:878 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25/userdata/shm major:0 minor:812 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/15eda3bde3926ace98dc82fe5b6fb4d1ace5d01b315a5e6ece92e5b50ae9132e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/15eda3bde3926ace98dc82fe5b6fb4d1ace5d01b315a5e6ece92e5b50ae9132e/userdata/shm major:0 minor:304 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe/userdata/shm major:0 minor:788 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2ebe3fb9cab9178261c34fb487eaacac7fa326d405ced605571d043522371ecf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ebe3fb9cab9178261c34fb487eaacac7fa326d405ced605571d043522371ecf/userdata/shm major:0 minor:597 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2f229196290719614f7bbcbd70dc1d3eb6df4440414271052c0e25cb9764e057/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2f229196290719614f7bbcbd70dc1d3eb6df4440414271052c0e25cb9764e057/userdata/shm major:0 minor:695 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210/userdata/shm major:0 minor:107 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/3580cc8aaddf6d9ceec4e9655520a84a1d14647aea74906c068b15c17cd230e2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3580cc8aaddf6d9ceec4e9655520a84a1d14647aea74906c068b15c17cd230e2/userdata/shm major:0 minor:466 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3b84ff6dcb01c2864416447de1ea9c58a9ceb02e0ee8e948fe0ed652019990a3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3b84ff6dcb01c2864416447de1ea9c58a9ceb02e0ee8e948fe0ed652019990a3/userdata/shm major:0 minor:330 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5/userdata/shm major:0 minor:658 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5/userdata/shm major:0 minor:679 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4316d2211dbd015d08696560a401855badad02d5162bd18e5f9b36a4aa80b6a7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4316d2211dbd015d08696560a401855badad02d5162bd18e5f9b36a4aa80b6a7/userdata/shm major:0 minor:80 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb/userdata/shm major:0 minor:52 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991/userdata/shm major:0 minor:459 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4d0ada47f0cb160d98966d63d8a86c801dfaedca21b5932c03647c7678f530ef/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4d0ada47f0cb160d98966d63d8a86c801dfaedca21b5932c03647c7678f530ef/userdata/shm major:0 minor:457 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500/userdata/shm major:0 minor:468 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204/userdata/shm major:0 minor:103 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/58d5b64552b14fa37f1c4ade1890dfcbcf78def52cdf495457e904377a1b0a43/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/58d5b64552b14fa37f1c4ade1890dfcbcf78def52cdf495457e904377a1b0a43/userdata/shm major:0 minor:758 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207/userdata/shm major:0 minor:385 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5ae9a109908423db1f7d35a526931ed2af44da77833edc3112c7f12de82644eb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5ae9a109908423db1f7d35a526931ed2af44da77833edc3112c7f12de82644eb/userdata/shm major:0 minor:808 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c/userdata/shm major:0 minor:803 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/62125637029a850812cbf1a1551ac9bf8a2431cbf9d2111e28185931308bf215/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/62125637029a850812cbf1a1551ac9bf8a2431cbf9d2111e28185931308bf215/userdata/shm major:0 minor:458 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/633fccc65fe5856fecc01dbcc7e58f5190eed4eb98e5e73385a0e9bcc0746e0e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/633fccc65fe5856fecc01dbcc7e58f5190eed4eb98e5e73385a0e9bcc0746e0e/userdata/shm major:0 minor:799 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6b70c6219cee771d6e858549f53b5dbf8004794c49061a1d0481404af45e4772/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6b70c6219cee771d6e858549f53b5dbf8004794c49061a1d0481404af45e4772/userdata/shm major:0 minor:772 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/78ed7df2de04c4d9012bf3b0bae0730cc7f525024f23a27fe0e47c32e46b41f6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/78ed7df2de04c4d9012bf3b0bae0730cc7f525024f23a27fe0e47c32e46b41f6/userdata/shm major:0 minor:488 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42/userdata/shm major:0 minor:147 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7/userdata/shm major:0 minor:618 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/83f05b1eef52787aaeaed1465a46122a61b271c0e893c29d510caa22b344a675/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/83f05b1eef52787aaeaed1465a46122a61b271c0e893c29d510caa22b344a675/userdata/shm major:0 minor:390 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/88dd8210417d34cd695549010f86bdfe2541add1af48e0e0b07c7ed8f524f103/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88dd8210417d34cd695549010f86bdfe2541add1af48e0e0b07c7ed8f524f103/userdata/shm major:0 minor:514 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c/userdata/shm major:0 minor:247 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/91645cb3e01b7383bf3c741eaf023e22432d4ae51de307f2749e304f203b0c13/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/91645cb3e01b7383bf3c741eaf023e22432d4ae51de307f2749e304f203b0c13/userdata/shm major:0 minor:818 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209/userdata/shm major:0 minor:823 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687/userdata/shm major:0 minor:86 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9bea8e39775551acb259adea0fc4cfd103c16875f290afb2712a31409a51f01c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9bea8e39775551acb259adea0fc4cfd103c16875f290afb2712a31409a51f01c/userdata/shm major:0 minor:382 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9f7751b6243f5b55d5db7507e92a7214e3b051f064f66c13d1a6b5d546c577a0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9f7751b6243f5b55d5db7507e92a7214e3b051f064f66c13d1a6b5d546c577a0/userdata/shm major:0 minor:461 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43/userdata/shm major:0 minor:126 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741/userdata/shm major:0 minor:596 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d/userdata/shm major:0 minor:53 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1/userdata/shm major:0 minor:378 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a/userdata/shm major:0 minor:464 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dd1819a433e70ea4c2b01b165e8a76f6644d7959ff5dbef7efb1f362b56038c1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dd1819a433e70ea4c2b01b165e8a76f6644d7959ff5dbef7efb1f362b56038c1/userdata/shm major:0 minor:300 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330/userdata/shm major:0 minor:408 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e4c44a8a218f4d3a8bf81e0a2e78942dceac9d2b2c4c60ba4ca23a60c107ed3b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4c44a8a218f4d3a8bf81e0a2e78942dceac9d2b2c4c60ba4ca23a60c107ed3b/userdata/shm major:0 minor:821 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e4c8b86557bfc322b9f1b1feea17aaefaf34263c41685f5164a347ec08c589e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4c8b86557bfc322b9f1b1feea17aaefaf34263c41685f5164a347ec08c589e8/userdata/shm major:0 minor:764 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc/userdata/shm major:0 minor:804 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37/userdata/shm major:0 minor:308 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f64ca30d2cf598d32dbab617a0a172e7aa2a1cb9512109dd3142530e06881cb4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f64ca30d2cf598d32dbab617a0a172e7aa2a1cb9512109dd3142530e06881cb4/userdata/shm major:0 minor:816 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f7e0d1fae2c29d1550044dbfbc303fe4f5bb6dc47066c479df51113017952abe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f7e0d1fae2c29d1550044dbfbc303fe4f5bb6dc47066c479df51113017952abe/userdata/shm major:0 minor:465 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes/kubernetes.io~projected/kube-api-access-sqzn8:{mountpoint:/var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes/kubernetes.io~projected/kube-api-access-sqzn8 major:0 minor:117 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes/kubernetes.io~secret/serving-cert major:0 minor:114 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~projected/kube-api-access-mlwd5:{mountpoint:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~projected/kube-api-access-mlwd5 major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cb70a30-a8d1-4037-81e6-eb4f0510a234/volumes/kubernetes.io~projected/kube-api-access-q7x89:{mountpoint:/var/lib/kubelet/pods/0cb70a30-a8d1-4037-81e6-eb4f0510a234/volumes/kubernetes.io~projected/kube-api-access-q7x89 major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cb70a30-a8d1-4037-81e6-eb4f0510a234/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0cb70a30-a8d1-4037-81e6-eb4f0510a234/volumes/kubernetes.io~secret/serving-cert major:0 minor:784 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/141cb120-92da-4d8d-bc29-fc4c433a6336/volumes/kubernetes.io~projected/kube-api-access-fhwd7:{mountpoint:/var/lib/kubelet/pods/141cb120-92da-4d8d-bc29-fc4c433a6336/volumes/kubernetes.io~projected/kube-api-access-fhwd7 major:0 minor:793 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/141cb120-92da-4d8d-bc29-fc4c433a6336/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/141cb120-92da-4d8d-bc29-fc4c433a6336/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:783 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~projected/ca-certs major:0 minor:490 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~projected/kube-api-access-gzntq:{mountpoint:/var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~projected/kube-api-access-gzntq major:0 minor:491 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~projected/kube-api-access-47czp:{mountpoint:/var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~projected/kube-api-access-47czp major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:95 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~projected/kube-api-access-lktk8:{mountpoint:/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~projected/kube-api-access-lktk8 major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~secret/srv-cert major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/307605e6-d1cf-4172-8e7d-918c435f3577/volumes/kubernetes.io~projected/kube-api-access-wrs54:{mountpoint:/var/lib/kubelet/pods/307605e6-d1cf-4172-8e7d-918c435f3577/volumes/kubernetes.io~projected/kube-api-access-wrs54 major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/31742478-0d83-48cf-b38b-02416d95d4a8/volumes/kubernetes.io~projected/kube-api-access-wz7d6:{mountpoint:/var/lib/kubelet/pods/31742478-0d83-48cf-b38b-02416d95d4a8/volumes/kubernetes.io~projected/kube-api-access-wz7d6 major:0 minor:791 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/31742478-0d83-48cf-b38b-02416d95d4a8/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/31742478-0d83-48cf-b38b-02416d95d4a8/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:782 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~projected/kube-api-access-npxz5:{mountpoint:/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~projected/kube-api-access-npxz5 major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:451 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~projected/kube-api-access-8zvxj:{mountpoint:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~projected/kube-api-access-8zvxj major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39bf78ac-304b-4b82-8729-d184657ef3bb/volumes/kubernetes.io~projected/kube-api-access-rltcj:{mountpoint:/var/lib/kubelet/pods/39bf78ac-304b-4b82-8729-d184657ef3bb/volumes/kubernetes.io~projected/kube-api-access-rltcj major:0 minor:806 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~projected/kube-api-access-vr9dj:{mountpoint:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~projected/kube-api-access-vr9dj major:0 minor:595 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/encryption-config major:0 minor:593 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/etcd-client major:0 minor:592 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/serving-cert major:0 minor:594 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~projected/kube-api-access-ptcvr:{mountpoint:/var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~projected/kube-api-access-ptcvr major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~secret/metrics-certs major:0 minor:100 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/467c2f01-2c23-41e2-acb9-08a84061fefc/volumes/kubernetes.io~projected/kube-api-access-mxtcq:{mountpoint:/var/lib/kubelet/pods/467c2f01-2c23-41e2-acb9-08a84061fefc/volumes/kubernetes.io~projected/kube-api-access-mxtcq major:0 minor:890 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/467c2f01-2c23-41e2-acb9-08a84061fefc/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/467c2f01-2c23-41e2-acb9-08a84061fefc/volumes/kubernetes.io~secret/proxy-tls major:0 minor:889 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~projected/kube-api-access-7k8wj:{mountpoint:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~projected/kube-api-access-7k8wj major:0 minor:224 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~projected/kube-api-access-4n2hg:{mountpoint:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~projected/kube-api-access-4n2hg major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~projected/kube-api-access major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:259 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/kube-api-access-548cd:{mountpoint:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/kube-api-access-548cd major:0 minor:241 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~secret/metrics-tls major:0 minor:454 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~projected/kube-api-access-qv8vk:{mountpoint:/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~projected/kube-api-access-qv8vk major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:101 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:585 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~empty-dir/tmp major:0 minor:584 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~projected/kube-api-access-rnfsx:{mountpoint:/var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~projected/kube-api-access-rnfsx major:0 minor:586 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~projected/kube-api-access-qh4t8:{mountpoint:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~projected/kube-api-access-qh4t8 major:0 minor:237 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~projected/kube-api-access-qn48v:{mountpoint:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~projected/kube-api-access-qn48v major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/872e5f8c-b014-4283-a4d2-0e2cfd29e192/volumes/kubernetes.io~projected/kube-api-access-kfpv6:{mountpoint:/var/lib/kubelet/pods/872e5f8c-b014-4283-a4d2-0e2cfd29e192/volumes/kubernetes.io~projected/kube-api-access-kfpv6 major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7/volumes/kubernetes.io~projected/kube-api-access-n49x9:{mountpoint:/var/lib/kubelet/pods/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7/volumes/kubernetes.io~projected/kube-api-access-n49x9 major:0 minor:769 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/89be0036-a2c8-48b4-9eaf-17fab972c4f4/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/89be0036-a2c8-48b4-9eaf-17fab972c4f4/volumes/kubernetes.io~projected/kube-api-access major:0 minor:754 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~projected/kube-api-access-ft9rs:{mountpoint:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~projected/kube-api-access-ft9rs major:0 minor:99 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~projected/kube-api-access-m4rtm:{mountpoint:/var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~projected/kube-api-access-m4rtm major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~secret/srv-cert major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~projected/kube-api-access-n6zkv:{mountpoint:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~projected/kube-api-access-n6zkv major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/etcd-client major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~projected/kube-api-access-fp46p:{mountpoint:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~projected/kube-api-access-fp46p major:0 minor:135 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:134 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~projected/kube-api-access-bgmwd:{mountpoint:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~projected/kube-api-access-bgmwd major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:453 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:450 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ca444a4-4d78-456f-9656-0c28076ce77e/volumes/kubernetes.io~projected/kube-api-access-kt22g:{mountpoint:/var/lib/kubelet/pods/9ca444a4-4d78-456f-9656-0c28076ce77e/volumes/kubernetes.io~projected/kube-api-access-kt22g major:0 minor:786 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ca444a4-4d78-456f-9656-0c28076ce77e/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/9ca444a4-4d78-456f-9656-0c28076ce77e/volumes/kubernetes.io~secret/proxy-tls major:0 minor:779 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc/volumes/kubernetes.io~projected/kube-api-access-2zz2n:{mountpoint:/var/lib/kubelet/pods/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc/volumes/kubernetes.io~projected/kube-api-access-2zz2n major:0 minor:363 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/kube-api-access-wdmtg:{mountpoint:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/kube-api-access-wdmtg major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:452 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~projected/kube-api-access-4mvqh:{mountpoint:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~projected/kube-api-access-4mvqh major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~projected/kube-api-access major:0 minor:225 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~projected/kube-api-access-x9zg8:{mountpoint:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~projected/kube-api-access-x9zg8 major:0 minor:148 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~secret/webhook-cert major:0 minor:146 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~projected/kube-api-access-s2ntw:{mountpoint:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~projected/kube-api-access-s2ntw major:0 minor:500 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/encryption-config major:0 minor:455 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/etcd-client major:0 minor:449 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/serving-cert major:0 minor:472 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b8f39c16-3a94-45c3-a51c-f2e81eff967d/volumes/kubernetes.io~projected/kube-api-access-qmdlx:{mountpoint:/var/lib/kubelet/pods/b8f39c16-3a94-45c3-a51c-f2e81eff967d/volumes/kubernetes.io~projected/kube-api-access-qmdlx major:0 minor:591 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8f39c16-3a94-45c3-a51c-f2e81eff967d/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/b8f39c16-3a94-45c3-a51c-f2e81eff967d/volumes/kubernetes.io~secret/metrics-tls major:0 minor:579 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bec90db1-02e3-4211-8c33-f8bcc304e3a7/volumes/kubernetes.io~projected/kube-api-access-nr5cd:{mountpoint:/var/lib/kubelet/pods/bec90db1-02e3-4211-8c33-f8bcc304e3a7/volumes/kubernetes.io~projected/kube-api-access-nr5cd major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2a16f6f-437c-4da5-a797-287e5e1ddbd4/volumes/kubernetes.io~projected/kube-api-access-ws5kr:{mountpoint:/var/lib/kubelet/pods/c2a16f6f-437c-4da5-a797-287e5e1ddbd4/volumes/kubernetes.io~projected/kube-api-access-ws5kr major:0 minor:794 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2a16f6f-437c-4da5-a797-287e5e1ddbd4/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/c2a16f6f-437c-4da5-a797-287e5e1ddbd4/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:780 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~projected/kube-api-access-dxdb6:{mountpoint:/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~projected/kube-api-access-dxdb6 major:0 minor:798 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~secret/cert major:0 minor:785 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:778 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes/kubernetes.io~projected/kube-api-access-w6qs5:{mountpoint:/var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes/kubernetes.io~projected/kube-api-access-w6qs5 major:0 minor:790 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:781 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d32541c9-eef6-417c-9f5a-a7392dc70aa0/volumes/kubernetes.io~projected/kube-api-access-fvp9m:{mountpoint:/var/lib/kubelet/pods/d32541c9-eef6-417c-9f5a-a7392dc70aa0/volumes/kubernetes.io~projected/kube-api-access-fvp9m major:0 minor:796 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d32541c9-eef6-417c-9f5a-a7392dc70aa0/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/d32541c9-eef6-417c-9f5a-a7392dc70aa0/volumes/kubernetes.io~secret/cert major:0 minor:795 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d486ce23-acf7-429a-9739-4770e1a2bf78/volumes/kubernetes.io~projected/kube-api-access-bzdjs:{mountpoint:/var/lib/kubelet/pods/d486ce23-acf7-429a-9739-4770e1a2bf78/volumes/kubernetes.io~projected/kube-api-access-bzdjs major:0 minor:340 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d486ce23-acf7-429a-9739-4770e1a2bf78/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/d486ce23-acf7-429a-9739-4770e1a2bf78/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:230 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d52fa1ad-0071-4506-bb94-e73d6f15a75c/volumes/kubernetes.io~projected/kube-api-access-xvg4q:{mountpoint:/var/lib/kubelet/pods/d52fa1ad-0071-4506-bb94-e73d6f15a75c/volumes/kubernetes.io~projected/kube-api-access-xvg4q major:0 minor:608 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d90f590a-6118-4769-b18f-fec67dd62c20/volumes/kubernetes.io~projected/kube-api-access-nljb2:{mountpoint:/var/lib/kubelet/pods/d90f590a-6118-4769-b18f-fec67dd62c20/volumes/kubernetes.io~projected/kube-api-access-nljb2 major:0 minor:388 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d90f590a-6118-4769-b18f-fec67dd62c20/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/d90f590a-6118-4769-b18f-fec67dd62c20/volumes/kubernetes.io~secret/signing-key major:0 minor:387 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db42b38e-294e-4016-8ac1-54126ac60de8/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/db42b38e-294e-4016-8ac1-54126ac60de8/volumes/kubernetes.io~projected/ca-certs major:0 minor:478 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db42b38e-294e-4016-8ac1-54126ac60de8/volumes/kubernetes.io~projected/kube-api-access-8dwx6:{mountpoint:/var/lib/kubelet/pods/db42b38e-294e-4016-8ac1-54126ac60de8/volumes/kubernetes.io~projected/kube-api-access-8dwx6 major:0 minor:487 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dc9945ac-4041-4120-b504-a173c2bf91bd/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/dc9945ac-4041-4120-b504-a173c2bf91bd/volumes/kubernetes.io~projected/kube-api-access major:0 minor:319 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dc9945ac-4041-4120-b504-a173c2bf91bd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/dc9945ac-4041-4120-b504-a173c2bf91bd/volumes/kubernetes.io~secret/serving-cert major:0 minor:318 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dd69fc33-59d4-4538-b4ec-e2d08ac11f72/volumes/kubernetes.io~projected/kube-api-access-txp58:{mountpoint:/var/lib/kubelet/pods/dd69fc33-59d4-4538-b4ec-e2d08ac11f72/volumes/kubernetes.io~projected/kube-api-access-txp58 major:0 minor:770 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dea35f60-33be-4ccc-b985-952eac3a85c0/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/dea35f60-33be-4ccc-b985-952eac3a85c0/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:91 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e09725c2-45c6-4a60-b817-6e5316d6f8e8/volumes/kubernetes.io~projected/kube-api-access-b49lj:{mountpoint:/var/lib/kubelet/pods/e09725c2-45c6-4a60-b817-6e5316d6f8e8/volumes/kubernetes.io~projected/kube-api-access-b49lj major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e3376275-294d-446d-9b4c-930df60dba01/volumes/kubernetes.io~projected/kube-api-access-cgsm7:{mountpoint:/var/lib/kubelet/pods/e3376275-294d-446d-9b4c-930df60dba01/volumes/kubernetes.io~projected/kube-api-access-cgsm7 major:0 minor:383 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e9ebcecb-c210-434e-83a1-825265e206f1/volumes/kubernetes.io~projected/kube-api-access-txxpw:{mountpoint:/var/lib/kubelet/pods/e9ebcecb-c210-434e-83a1-825265e206f1/volumes/kubernetes.io~projected/kube-api-access-txxpw major:0 minor:110 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~projected/kube-api-access-pvq8m:{mountpoint:/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~projected/kube-api-access-pvq8m major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~secret/metrics-tls major:0 minor:456 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~projected/kube-api-access-fsdjh:{mountpoint:/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~projected/kube-api-access-fsdjh major:0 minor:817 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:814 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~secret/webhook-cert major:0 minor:813 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f1943401-a75b-4e45-8c65-3cc36018d8c4/volumes/kubernetes.io~projected/kube-api-access-8cxfs:{mountpoint:/var/lib/kubelet/pods/f1943401-a75b-4e45-8c65-3cc36018d8c4/volumes/kubernetes.io~projected/kube-api-access-8cxfs major:0 minor:863 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~projected/kube-api-access-bd8nz:{mountpoint:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~projected/kube-api-access-bd8nz major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8fdab32-4e61-4e9c-a506-52121f625669/volumes/kubernetes.io~projected/kube-api-access-5xl5z:{mountpoint:/var/lib/kubelet/pods/f8fdab32-4e61-4e9c-a506-52121f625669/volumes/kubernetes.io~projected/kube-api-access-5xl5z major:0 minor:678 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f8fdab32-4e61-4e9c-a506-52121f625669/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/f8fdab32-4e61-4e9c-a506-52121f625669/volumes/kubernetes.io~secret/webhook-certs major:0 minor:648 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f93b8728-4a33-4ee4-b7c6-cff7d7995953/volumes/kubernetes.io~projected/kube-api-access-kfw5k:{mountpoint:/var/lib/kubelet/pods/f93b8728-4a33-4ee4-b7c6-cff7d7995953/volumes/kubernetes.io~projected/kube-api-access-kfw5k major:0 minor:789 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f93b8728-4a33-4ee4-b7c6-cff7d7995953/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/f93b8728-4a33-4ee4-b7c6-cff7d7995953/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:787 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes/kubernetes.io~projected/kube-api-access-2p6wn:{mountpoint:/var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes/kubernetes.io~projected/kube-api-access-2p6wn major:0 minor:116 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes/kubernetes.io~secret/serving-cert major:0 minor:115 fsType:tmpfs blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/3b97a8f1b9aa695d2b507e36574de13fb7b2d893d3f7915c2ff188f5dfa3ce89/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/adeb790a2e31fe1b9b264b3c4c44d9de1f0407f9dae672fec0bc6364f6ae83fd/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/7c1d03227cc514591244dc86b0488b2dd57a2f0120c552b2e59e9b0db70adbda/merged major:0 minor:121 fsType:overlay blockSize:0} 
overlay_0-128:{mountpoint:/var/lib/containers/storage/overlay/27837f8a552897312a5df71dc51f67a7f2566ef138ed2ef3eb8acff9f26aa0c2/merged major:0 minor:128 fsType:overlay blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/cc31dfd2379c9971b69afc9947cf4c7ac982d3db70c2b85aa2250c5bc552f1e4/merged major:0 minor:130 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/664defcb16835a98985b80e688f010e510b521e3c7bbf8694bec00a9717f5846/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-143:{mountpoint:/var/lib/containers/storage/overlay/16e4be12f350c7085bb021957915aa09d93d058a8fbd6f621bf8850678f5b02d/merged major:0 minor:143 fsType:overlay blockSize:0} overlay_0-145:{mountpoint:/var/lib/containers/storage/overlay/f0ce9109e3f2ea0960d438756d371d4f67f25534e78d031377afc9d510035bc6/merged major:0 minor:145 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/3a803f486f6d8c4b685d88ab54fcdb5f980a4adc9a2570fdcc385ad92adbd79c/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/92ed9422a8a52a0b815c358138f5c4ce5582906c2a8e1b9c290049a52c6b7a0b/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/12769642dae853082d9d0189953db8bea65bcfec768237b64fe7dda87fbc2a20/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-165:{mountpoint:/var/lib/containers/storage/overlay/0e6e6544a0f5b188fde00cb534b87d94aa8cffc724a593eacd40cccdd26bdc64/merged major:0 minor:165 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/a764f296423c1358459855623b54a76a7b4a675e9a54fdfc94d5f6860ae165d4/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/75fb4672b66560f1f37663437160beb7c1238bb2a2a11bdf8de39e030c94f89e/merged major:0 minor:174 fsType:overlay blockSize:0} 
overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/88d58b0aef7f9e8851dfa4e94ff3c2f1a038ffd451a5d817070acf94fe3c0436/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/1a268ffb078841f019c6c17a4b112c9528c058506abaa493dd32ddee8a90c13d/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/9604cc5fda4c68a8ea9e20e3cfbf191b34fee6153482c301760a9b4f1840a1de/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/badc0396407a89da87562b0406ef41865cfbc02c14b3633ca5291e241772ec31/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/639af4189b80d4c11315fdecd4017cc1666a4da28b619bde56a08992623a7f2d/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/db818d7ee7fbc6743e6cf78d723adedb29810fd455fac9767ee4c5b5faf10fae/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-264:{mountpoint:/var/lib/containers/storage/overlay/8dda2f860fbc36b9f750d0c72fbdbf3e38b965f3f0a7e8d692add25e1b8f7ee8/merged major:0 minor:264 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/f5fa9ce224d3935abdab9fcc2d7b187841600879a7b8506f76f36d505b15bb06/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/ff0ae64d4e511bdd3c156810246aa5c2a6bed651e0fe62a07f14ce1814b4ef48/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/b6713f448c2700caf59c43d9e781c3675ba3cb1c595933f30146cfd29a152c8b/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/5fde0e7afae152fde48b25fba2aa481ad10fa1cff01edde0fd4aa56aa5fc5ecc/merged major:0 minor:283 fsType:overlay blockSize:0} 
overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/1a5d06ca1cfee170c56823e73d61225cede0610e23730ad975030ebb61a4f92c/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/ed0c97368b53bd6e86d24abb85153543a3379fe0eee724199d2ad22ea19aa1df/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/1ae6a436a8f1d4191e9028cc4b927bac6a164407f90cbfdf5502380940e02ef7/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/ced8babf02c9aad5924da9fcea6dc8cf9e8d31ac96d234acc2a3b23b155c3343/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/39ff757ba9cb89cbaad35b01108900675af235291146ab968ab26a1e3909d92f/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/2d3aa458443bac5d28c98dcc664d0b59520eb448dd976ff47d4132081c46d4c9/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/1e941401408c85a846df02d8c24cef1fbd649f6297281f42aa2244993161f1e9/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/fd77557ff2dd557b9dbb3cbd729e994fa36c6975a8dfe8ab15d66fee2bf4912f/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-306:{mountpoint:/var/lib/containers/storage/overlay/7f36bf6c9c6d626ecd848086278e62715a924bd46aa1903c07f21c41904854e0/merged major:0 minor:306 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/4d0a9786e09b8cf31b92a4ee7152c236a20098ce645c73347509726b6c105b61/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-312:{mountpoint:/var/lib/containers/storage/overlay/d40d65bc794ea52e49ffc1d4ca36c0e30efd448f7b072e7671130e757d70b9e9/merged major:0 minor:312 fsType:overlay blockSize:0} 
overlay_0-314:{mountpoint:/var/lib/containers/storage/overlay/cc6b08343b881f6aa55a64c66fc7a1ad28425a8a338a9181fa84b9c800f4bcdd/merged major:0 minor:314 fsType:overlay blockSize:0} overlay_0-324:{mountpoint:/var/lib/containers/storage/overlay/04c6e8098ce99326640365bc29499d11b6b732ea4c2e4d5379b2043672e20627/merged major:0 minor:324 fsType:overlay blockSize:0} overlay_0-326:{mountpoint:/var/lib/containers/storage/overlay/fa1d6984ef4063b3e6aabc54549fdc6a274cb5c2936d95bd933b8fae92ec2192/merged major:0 minor:326 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/58f40041cd6b55c6fb65d0ebebfef22eb009a714e732d1ab7c0360a184a4ada4/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-334:{mountpoint:/var/lib/containers/storage/overlay/c0a153b60e0f91b9a13b24863c1ce84cb8697bf6cfa99d4314591a68b19e65a0/merged major:0 minor:334 fsType:overlay blockSize:0} overlay_0-335:{mountpoint:/var/lib/containers/storage/overlay/323665babd02903b2c28aa5cf53bec5acb792f4b7eeab1ea51cec00ff5b03643/merged major:0 minor:335 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/065c63fd9ca9975aae5df253904f461afcaa48f964413798cfb9a10b75891887/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/c123cc9f69220b6687c7fc164c506bc579bcee3a96976ab0daff16b603049589/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-359:{mountpoint:/var/lib/containers/storage/overlay/7506cf9f56b4f325fd0dc6e33b7bb88e339d59ee29312467d6bf200aa56fe3e9/merged major:0 minor:359 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/34b21783cc246f39814b1b8bbe16f323531c9df6f2a853644f099e15232f92ee/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-380:{mountpoint:/var/lib/containers/storage/overlay/ad4619344b5c8bc904a9d3f3a5606db2c9aec4ce979927066c178bfb2cb80a66/merged major:0 minor:380 fsType:overlay blockSize:0} 
overlay_0-386:{mountpoint:/var/lib/containers/storage/overlay/c667bd2320761d24a8495de7afd4bfa56791ba293d69500768876a90aa544937/merged major:0 minor:386 fsType:overlay blockSize:0} overlay_0-392:{mountpoint:/var/lib/containers/storage/overlay/d937355b7af709a2e5d064d3fa91a10f330a3144ac1860999ca8ec259182de72/merged major:0 minor:392 fsType:overlay blockSize:0} overlay_0-396:{mountpoint:/var/lib/containers/storage/overlay/12dae8c3e185c35b35182181f50ff6ded38392fe69a87fe254c1a6da33d374b9/merged major:0 minor:396 fsType:overlay blockSize:0} overlay_0-398:{mountpoint:/var/lib/containers/storage/overlay/2dfb2df8a6466cf9b6c5dcd15c7b242be395cbd18ee8b3f9026e080853fc3b1b/merged major:0 minor:398 fsType:overlay blockSize:0} overlay_0-400:{mountpoint:/var/lib/containers/storage/overlay/4a3766add3c3e56541767fc0230636b2ac0ba185f7cc92afd8f14d6d9d705150/merged major:0 minor:400 fsType:overlay blockSize:0} overlay_0-401:{mountpoint:/var/lib/containers/storage/overlay/ef25a1e01cd3c853fd495d60c21c2a9ff2e7a39884251c902778c9bee6a490c0/merged major:0 minor:401 fsType:overlay blockSize:0} overlay_0-414:{mountpoint:/var/lib/containers/storage/overlay/f8d18a388ce8747448cc74b57b98629c2d14ef2b05c1ed66e3bd68360b3ac7d7/merged major:0 minor:414 fsType:overlay blockSize:0} overlay_0-416:{mountpoint:/var/lib/containers/storage/overlay/57e18d93345527840c67c9751c6f62a11530306b1729e59dcfa0ea83bd6eebd2/merged major:0 minor:416 fsType:overlay blockSize:0} overlay_0-418:{mountpoint:/var/lib/containers/storage/overlay/15ffd40103e8c88b51f16e0211e58ccc4b1268a92a04716a8c7362afdca71717/merged major:0 minor:418 fsType:overlay blockSize:0} overlay_0-419:{mountpoint:/var/lib/containers/storage/overlay/210ad8915fa354da24005ad72363ab17bbbab98f883997c4d1f9ad37c505b81e/merged major:0 minor:419 fsType:overlay blockSize:0} overlay_0-42:{mountpoint:/var/lib/containers/storage/overlay/9e907272817cf07ee48f555da897b25c068f7d602a33ffb783682f7e606128b8/merged major:0 minor:42 fsType:overlay blockSize:0} 
overlay_0-438:{mountpoint:/var/lib/containers/storage/overlay/011fee008c079fec093dff1c36a3fe70cd902f83e488a628112e6569c1b50e5f/merged major:0 minor:438 fsType:overlay blockSize:0} overlay_0-441:{mountpoint:/var/lib/containers/storage/overlay/03664166e7e7fee7d21d1f7ca54088a3ca1018919a9b01636047d456b089dd4e/merged major:0 minor:441 fsType:overlay blockSize:0} overlay_0-443:{mountpoint:/var/lib/containers/storage/overlay/5bcbca3d9b428639cca2ebb6b71d3b39b880cacc33ae8a6a80dc2b677ad8820c/merged major:0 minor:443 fsType:overlay blockSize:0} overlay_0-445:{mountpoint:/var/lib/containers/storage/overlay/6d4ad7388a4eae197dd3024660c11ec2ef320b8452b7dcba58deaca3d7e975ef/merged major:0 minor:445 fsType:overlay blockSize:0} overlay_0-447:{mountpoint:/var/lib/containers/storage/overlay/7965840b7d6e5d828e144b6c4159cf592c0816765dbb51d653eaa2948226056b/merged major:0 minor:447 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/10136155de1f48ef9cda25f8666729574feebba423dd63d6bad6ae979448b7d5/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-473:{mountpoint:/var/lib/containers/storage/overlay/eb874886faf0eadd36ba980da0d3fc8fc2f5c409e19c17df1901abcd5ddc54a7/merged major:0 minor:473 fsType:overlay blockSize:0} overlay_0-475:{mountpoint:/var/lib/containers/storage/overlay/1bede61ac245fda30ca1ee82adce3f1c09959d96eb36ea473171c7307c4a7f22/merged major:0 minor:475 fsType:overlay blockSize:0} overlay_0-477:{mountpoint:/var/lib/containers/storage/overlay/7e419d282eef40fd49a49c3d93d39016a3a7b7231ef2d996c153daee2a1dd49f/merged major:0 minor:477 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/30cc68e416cbd92ed84e5e6c75545d990a2416335f14354f914d31c6a659becf/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-481:{mountpoint:/var/lib/containers/storage/overlay/e9ba15d4c377a63af51361bb553f88cc191194fc7e468d9c345e4abe15bb793b/merged major:0 minor:481 fsType:overlay blockSize:0} 
overlay_0-483:{mountpoint:/var/lib/containers/storage/overlay/c4910d3beae6aa6428b591c51afe28af183dede5200aefa90b888b665121a6fd/merged major:0 minor:483 fsType:overlay blockSize:0} overlay_0-485:{mountpoint:/var/lib/containers/storage/overlay/890211886b521b1cf041e83c516ca4b362202f6ee5c35512a90566b728ef63f0/merged major:0 minor:485 fsType:overlay blockSize:0} overlay_0-494:{mountpoint:/var/lib/containers/storage/overlay/aef20de8bbc7e5d5cdfbaceba1b94939990a7fd8597fccfb9a583668efbffb6b/merged major:0 minor:494 fsType:overlay blockSize:0} overlay_0-496:{mountpoint:/var/lib/containers/storage/overlay/833d70457820574fa849c85282f92928a9ea17eb78b9e0670a5d2ea8c38d5916/merged major:0 minor:496 fsType:overlay blockSize:0} overlay_0-498:{mountpoint:/var/lib/containers/storage/overlay/da1d89755731f5637b027dbedeb632ea5d4aeef7562eebb1db7b8c14561487a0/merged major:0 minor:498 fsType:overlay blockSize:0} overlay_0-50:{mountpoint:/var/lib/containers/storage/overlay/fc4d3cb599973026d25526d5b27a0fd112bc38b75558d91b18bb7dd11dadb292/merged major:0 minor:50 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/90a67eaff5a046ec10246fe9fca1f3fa741ba2a149ff1a6061cd2dbe5c21507e/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-502:{mountpoint:/var/lib/containers/storage/overlay/e3fe58ea8879fd7aabfa3646a0bc125fc2c066efcbb5c6c25116f056d565afab/merged major:0 minor:502 fsType:overlay blockSize:0} overlay_0-504:{mountpoint:/var/lib/containers/storage/overlay/8ebc2fb19a8a047e39051e48423adbb2519acb82903e58e0e4cd9fd95b4ebc8a/merged major:0 minor:504 fsType:overlay blockSize:0} overlay_0-515:{mountpoint:/var/lib/containers/storage/overlay/369ea1cb2430a7b1fa2b8c550794ba1b3effed2eac3c1ea9084460ec8ae32bb9/merged major:0 minor:515 fsType:overlay blockSize:0} overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/35af3f8ab2771013547157d21bcccd709c488862e23a755ae2ed993a83390144/merged major:0 minor:518 fsType:overlay blockSize:0} 
overlay_0-520:{mountpoint:/var/lib/containers/storage/overlay/7ef5815c3784ca4c854f63bd0cbb56765d3c9ed01b2ae704dc5420a79a15d5e4/merged major:0 minor:520 fsType:overlay blockSize:0} overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/3c8a9e17a9b59687215af8cbb8b283cfbc26eb491a5aeb45ba2b242c3fe34e11/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-525:{mountpoint:/var/lib/containers/storage/overlay/5aeeb031b7c583df5826fbfb5d85c6f21f9a103152dce05af81d0daadd09a675/merged major:0 minor:525 fsType:overlay blockSize:0} overlay_0-529:{mountpoint:/var/lib/containers/storage/overlay/7e26302f98108634dbfd32b6306b9299eae41ef350e86236704fcdf77fc5301a/merged major:0 minor:529 fsType:overlay blockSize:0} overlay_0-535:{mountpoint:/var/lib/containers/storage/overlay/64fa0651bb4b8248b650fc028b6155989192c5af659f3667e165cdd903c275ac/merged major:0 minor:535 fsType:overlay blockSize:0} overlay_0-540:{mountpoint:/var/lib/containers/storage/overlay/da0069909f17cec2fe6adc4b38f06a471af1566902f07d95f4b0db9631d98d2e/merged major:0 minor:540 fsType:overlay blockSize:0} overlay_0-548:{mountpoint:/var/lib/containers/storage/overlay/3672bc3ffb511a7c2f7e8c60e975214f8e3d6f846e17f8fe3d6280bb1e2df2e1/merged major:0 minor:548 fsType:overlay blockSize:0} overlay_0-552:{mountpoint:/var/lib/containers/storage/overlay/5bedcfe6bcb4bf0dff4decf524dd9503e0f05aa92f2330c780dc4a4a384c21d9/merged major:0 minor:552 fsType:overlay blockSize:0} overlay_0-556:{mountpoint:/var/lib/containers/storage/overlay/ebf361e3c944e55048de7457593615764e3c01355ea06d823b3f9a46859a2b8c/merged major:0 minor:556 fsType:overlay blockSize:0} overlay_0-558:{mountpoint:/var/lib/containers/storage/overlay/5e37ca498d47021d4338a877f72973452a5d3955bd097f2662b3f339086ee6d5/merged major:0 minor:558 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/72d7a3701ba3ba27c47276741c6a79bb2abeae81ae36c32502b637a3921b5010/merged major:0 minor:56 fsType:overlay blockSize:0} 
overlay_0-577:{mountpoint:/var/lib/containers/storage/overlay/3d8bf1bafdf01f5dae6553da84f692cb58192d822dadd4529e284f0f337c12c3/merged major:0 minor:577 fsType:overlay blockSize:0} overlay_0-580:{mountpoint:/var/lib/containers/storage/overlay/a76e379edb9f32e17adbcbcf6149d45f86ef28fa27d9257b5723d538894c3c68/merged major:0 minor:580 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/28de7bec1a409f75e2b93c9f2c19b29ac95ea4778c28e02079796967ed0b702d/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-600:{mountpoint:/var/lib/containers/storage/overlay/0c9dbbbadbe39214fdb4559bd28d6cbe9762bb943bed881901a940d1846545d3/merged major:0 minor:600 fsType:overlay blockSize:0} overlay_0-604:{mountpoint:/var/lib/containers/storage/overlay/1ef8eb55e05213875fdb6e4a741edf55a24f0409f4778b45caf23df7afea7f73/merged major:0 minor:604 fsType:overlay blockSize:0} overlay_0-606:{mountpoint:/var/lib/containers/storage/overlay/69b966f69a5d17c61cabb1463c48066d361509936880720bfa2cafdbd0ef01f5/merged major:0 minor:606 fsType:overlay blockSize:0} overlay_0-609:{mountpoint:/var/lib/containers/storage/overlay/3741b37a938bb8c2f535b316274f9b30f0e622fb623cf809804b189f77e43656/merged major:0 minor:609 fsType:overlay blockSize:0} overlay_0-614:{mountpoint:/var/lib/containers/storage/overlay/6952bc54d4fc609ebd54db087174c52ce146d61ac64bcd8798b38c53620790b7/merged major:0 minor:614 fsType:overlay blockSize:0} overlay_0-620:{mountpoint:/var/lib/containers/storage/overlay/435940ae0ad2868e0a7ca4972c6a53f683736f5230a4d67447b981519138f2be/merged major:0 minor:620 fsType:overlay blockSize:0} overlay_0-622:{mountpoint:/var/lib/containers/storage/overlay/fa8f910c18c2c290d6a3142ffb29e41e3cf0aaa49aab0f174a1d9080310470dd/merged major:0 minor:622 fsType:overlay blockSize:0} overlay_0-625:{mountpoint:/var/lib/containers/storage/overlay/e5744259a0a565f7d22505e1eecf6a2bb12bc2d278e9f1e177f80c42a2106791/merged major:0 minor:625 fsType:overlay blockSize:0} 
overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/620d97164eab4b3fff415d00ac1fae4fcaa6bf9b3165232f7913c82e18140a22/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-655:{mountpoint:/var/lib/containers/storage/overlay/36df0ea16a45d5ef6b6abe4754b5acd3edeb5545d9e1d6b0d7adbd697e4b7bb6/merged major:0 minor:655 fsType:overlay blockSize:0} overlay_0-657:{mountpoint:/var/lib/containers/storage/overlay/f807788a7152ea95baf8c7c5c79416313ccc885d7d54d5cb455cd9ee561d48a4/merged major:0 minor:657 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/0565416f5577af57846a9a39d30a797d6393335c02c148833b9d4483c38ecf5b/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-660:{mountpoint:/var/lib/containers/storage/overlay/fa5b206f64107ac802bef56b88e09fcdbf8883f9f8a530eb183534d8cf62c022/merged major:0 minor:660 fsType:overlay blockSize:0} overlay_0-662:{mountpoint:/var/lib/containers/storage/overlay/72ecda17ef147678bf5f00eafd7cec79c874a531daebca189305ff5cd5c82a22/merged major:0 minor:662 fsType:overlay blockSize:0} overlay_0-664:{mountpoint:/var/lib/containers/storage/overlay/208ae492d41bef4a5d953a86b9311b56bda541b3fd088b3209cc8e2d28b8220d/merged major:0 minor:664 fsType:overlay blockSize:0} overlay_0-681:{mountpoint:/var/lib/containers/storage/overlay/eeb195781a3691885b78c37de68e01e397e045ba0c8f768e4134ce11ea26fdfa/merged major:0 minor:681 fsType:overlay blockSize:0} overlay_0-683:{mountpoint:/var/lib/containers/storage/overlay/623cbb3384b6c16692a874bf64b7f68e89645e5e17e77b84e1a72fa961d7a57e/merged major:0 minor:683 fsType:overlay blockSize:0} overlay_0-685:{mountpoint:/var/lib/containers/storage/overlay/f851d1ec3acf124e37973588ac0938e868594d7d3b536e55255ffe4b3b5e1593/merged major:0 minor:685 fsType:overlay blockSize:0} overlay_0-697:{mountpoint:/var/lib/containers/storage/overlay/fc903ac8f129e4d79b822cfdf26ccfe9bdbe8ba3a731f00669ab1456999e4e98/merged major:0 minor:697 fsType:overlay blockSize:0} 
overlay_0-700:{mountpoint:/var/lib/containers/storage/overlay/95453a0f6c0ab0f33bf9ae5106e47cbaf425988d0e31bed34324cf4fb4543304/merged major:0 minor:700 fsType:overlay blockSize:0} overlay_0-705:{mountpoint:/var/lib/containers/storage/overlay/3ccac8f64a3fd4c130000f69aaa166539d86159d7e3dc0c5423294f4dfcdeaa0/merged major:0 minor:705 fsType:overlay blockSize:0} overlay_0-706:{mountpoint:/var/lib/containers/storage/overlay/fd9d512e99080c667412b7da122668615fb683ef6ce0ec16312bbe6912989768/merged major:0 minor:706 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/d245613e0a3062b8d22805d9fd0a1ab88c479b89ae2b438c04aaef6fdf316a6e/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-718:{mountpoint:/var/lib/containers/storage/overlay/8585260d60163c2a8fb4fbbf857dccf0036b5843e7b017e41e0bcdcdf1a51c1c/merged major:0 minor:718 fsType:overlay blockSize:0} overlay_0-720:{mountpoint:/var/lib/containers/storage/overlay/02232d48eee464dadd31bb313bfc06b5f32d9bfee23343362667430ada2057a6/merged major:0 minor:720 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/6e83693e28ae9a6806eb09e0099ab047dd1912e9bf48507db4404382bfafd271/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-730:{mountpoint:/var/lib/containers/storage/overlay/10b27c77343a82669f04301b24466e77c271eaabfe45a3f93b87b7af88a61340/merged major:0 minor:730 fsType:overlay blockSize:0} overlay_0-732:{mountpoint:/var/lib/containers/storage/overlay/72dc47bf05f8bd91ff3771f027ddeed83f5aa4b107467e26e591005d5174ca8a/merged major:0 minor:732 fsType:overlay blockSize:0} overlay_0-737:{mountpoint:/var/lib/containers/storage/overlay/7ee1032f4bd777e7612073373d9bcf9c7efb9bdafa357b37501456eebbaf21f0/merged major:0 minor:737 fsType:overlay blockSize:0} overlay_0-742:{mountpoint:/var/lib/containers/storage/overlay/49c2190347560cd4fe11feb1398547b495548f54fac35da9e9f09c36bc412737/merged major:0 minor:742 fsType:overlay blockSize:0} 
overlay_0-748:{mountpoint:/var/lib/containers/storage/overlay/f0b9eba4be0148e1ee1d9b08e6dc3e1a3b4cf7f3c968d976f1216158d936bd13/merged major:0 minor:748 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/124f026298bffd3a97c4b63309edd7819188935362e954fc10e5aa24defd376b/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-760:{mountpoint:/var/lib/containers/storage/overlay/ed3264a2a02884342a444c6ca45024f29d5c96f0f8dc037a0bd1b0b477eaf384/merged major:0 minor:760 fsType:overlay blockSize:0} overlay_0-767:{mountpoint:/var/lib/containers/storage/overlay/1e571420b31b1a8a406aff994e8df681fc853b45f72b95cf4432f98c7445ebc9/merged major:0 minor:767 fsType:overlay blockSize:0} overlay_0-77:{mountpoint:/var/lib/containers/storage/overlay/0998953dc543519d46084a03e29621685d3150654c7153ef78f79a28b6b2c5c5/merged major:0 minor:77 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/a72bbe0139e8052b408a9efa43c6cf4486f8b2fe0e98574c3bbb99ac8f8d749e/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-776:{mountpoint:/var/lib/containers/storage/overlay/8548897c9355d7f234df34747d104d2d2b542856f2d0eb1f3e153c94f41436a5/merged major:0 minor:776 fsType:overlay blockSize:0} overlay_0-801:{mountpoint:/var/lib/containers/storage/overlay/82c2021a64bfece2ad1fa14cfcb0bba65cf02494130549df442b99a80f41c24f/merged major:0 minor:801 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/a744a853b32dfefd796f069645d269400bc48ee90acfad257410c5c6e4ad4405/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-824:{mountpoint:/var/lib/containers/storage/overlay/1799b23426e93fe1e1ae5cc509593d763691c74c6d8742a64571387487a45c18/merged major:0 minor:824 fsType:overlay blockSize:0} overlay_0-827:{mountpoint:/var/lib/containers/storage/overlay/b1052bfa942355c8f01f4d67cfbf62b4f700a660ea667a7574bfdf100c06bec2/merged major:0 minor:827 fsType:overlay blockSize:0} 
overlay_0-829:{mountpoint:/var/lib/containers/storage/overlay/7098cc6745b5f5fc5e7e49b336c7b1a03c2caa736d73939924ad17065a16c5ac/merged major:0 minor:829 fsType:overlay blockSize:0} overlay_0-835:{mountpoint:/var/lib/containers/storage/overlay/d9621e36e8c9144f174c6b3a5bfcfbdb53d0726a66a5b65774fc40d8b6467963/merged major:0 minor:835 fsType:overlay blockSize:0} overlay_0-837:{mountpoint:/var/lib/containers/storage/overlay/05b1ca488484b9277f2311f1b175713254b6bc14c7f1122221cc78c752b3dd0d/merged major:0 minor:837 fsType:overlay blockSize:0} overlay_0-840:{mountpoint:/var/lib/containers/storage/overlay/89aeb271a8e0dc41b36a5fe200ce2b8353209d6aadf2b48f86be3b4b78ce091d/merged major:0 minor:840 fsType:overlay blockSize:0} overlay_0-842:{mountpoint:/var/lib/containers/storage/overlay/d09288f793fd5ed9accea2ec8e0a66ec1ad6edf7c72d9c4bdf204cee8b7ecc56/merged major:0 minor:842 fsType:overlay blockSize:0} overlay_0-847:{mountpoint:/var/lib/containers/storage/overlay/208ce5df21a221b90e21816640bb974cd1b2bc69ceb018ae1b4678eee9e92806/merged major:0 minor:847 fsType:overlay blockSize:0} overlay_0-848:{mountpoint:/var/lib/containers/storage/overlay/fa60d60d5b580528de0899aaa95aa0644b2d49cc4800e1c18eb056c2c8bb3efb/merged major:0 minor:848 fsType:overlay blockSize:0} overlay_0-85:{mountpoint:/var/lib/containers/storage/overlay/3665421ed5e0f84a8134d179d4311e80ee7302c0626d83e562c8aa2602b6359b/merged major:0 minor:85 fsType:overlay blockSize:0} overlay_0-850:{mountpoint:/var/lib/containers/storage/overlay/60d60e76a0e0b2c21cb3a4f75f7009ca975ce1a4fbc6b8f65a025f63ef561cc5/merged major:0 minor:850 fsType:overlay blockSize:0} overlay_0-852:{mountpoint:/var/lib/containers/storage/overlay/9a3c1fc4a8b20c8c447db50b802b268c76c0db14b0697f253e49d22f74ba19c1/merged major:0 minor:852 fsType:overlay blockSize:0} overlay_0-857:{mountpoint:/var/lib/containers/storage/overlay/4feabe44217d5a65c6e97edcac6c4eb13a04b5e5d15ce7321f0170ba170c419c/merged major:0 minor:857 fsType:overlay blockSize:0} 
overlay_0-860:{mountpoint:/var/lib/containers/storage/overlay/857af8077a2ee592397491933347bab9da9ea38eb2f7b7ac108ff0803b605ad8/merged major:0 minor:860 fsType:overlay blockSize:0} overlay_0-861:{mountpoint:/var/lib/containers/storage/overlay/1bb70f3c6c493a9227fce79445b2e7fab8a9c1956abf42c9c60805ccd12ff9d8/merged major:0 minor:861 fsType:overlay blockSize:0} overlay_0-864:{mountpoint:/var/lib/containers/storage/overlay/8283413bc046615dc465e11d3705dece7425df10ab08395c464c0fb22361fde6/merged major:0 minor:864 fsType:overlay blockSize:0} overlay_0-881:{mountpoint:/var/lib/containers/storage/overlay/9b8ef958553b410783fd507cd01e6ab105a670acf32d1714112628c56a681ac0/merged major:0 minor:881 fsType:overlay blockSize:0} overlay_0-883:{mountpoint:/var/lib/containers/storage/overlay/d18af215fb8f3aebd209e60382ca33660d81949c16039aebc99d7be1c11271c5/merged major:0 minor:883 fsType:overlay blockSize:0} overlay_0-895:{mountpoint:/var/lib/containers/storage/overlay/00b7c89feb2fdf345fbe0bc659e6358b0d12f8f59f25bb5858c624ee1ec4fbb7/merged major:0 minor:895 fsType:overlay blockSize:0} overlay_0-900:{mountpoint:/var/lib/containers/storage/overlay/b9b128ca26d1a833cf4d1bd0fbb8faa565335599908d7f96f5936e3fb5c0f5aa/merged major:0 minor:900 fsType:overlay blockSize:0} overlay_0-903:{mountpoint:/var/lib/containers/storage/overlay/0ff6af75ae2f0a6b470bb262214c25fb2e7f5da9ad21d1a21f14d4f752e8e32e/merged major:0 minor:903 fsType:overlay blockSize:0} overlay_0-908:{mountpoint:/var/lib/containers/storage/overlay/b8b939a2f3e4739835d6434bf7f0e79c860d22442c2e947268e7bc14b4468c8c/merged major:0 minor:908 fsType:overlay blockSize:0} overlay_0-917:{mountpoint:/var/lib/containers/storage/overlay/f18f216beadf2523258daa395418223efe07c1732cc88696a01667ef9651aa75/merged major:0 minor:917 fsType:overlay blockSize:0} overlay_0-923:{mountpoint:/var/lib/containers/storage/overlay/423619c7daaa25428ec2f83a2311621a22457e192556637871015c8b01e7d6df/merged major:0 minor:923 fsType:overlay blockSize:0} 
overlay_0-925:{mountpoint:/var/lib/containers/storage/overlay/11ccdf0d535eb4632d9ac3a64033c3ac95fcc5613016e4ede23c7d1695cef251/merged major:0 minor:925 fsType:overlay blockSize:0} overlay_0-93:{mountpoint:/var/lib/containers/storage/overlay/25c08a51a8db01383471260d86786ba4b5ed13da48a74eca022c3ed5519f00d7/merged major:0 minor:93 fsType:overlay blockSize:0} overlay_0-933:{mountpoint:/var/lib/containers/storage/overlay/8443ed6a46864bf4f5e3c76b1d1290f491f3ba368d5a8f59cb20c90f02684839/merged major:0 minor:933 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/ec0900ab3eb05ea93c039834024496f79fb3a92342403d7ec1f84d0b857dcf42/merged major:0 minor:96 fsType:overlay blockSize:0}] Mar 19 09:24:42.752344 master-0 kubenswrapper[15202]: I0319 09:24:42.751285 15202 manager.go:217] Machine: {Timestamp:2026-03-19 09:24:42.750554574 +0000 UTC m=+0.135969410 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:81766350eaa9426e82b63b9a7bdd6612 SystemUUID:81766350-eaa9-426e-82b6-3b9a7bdd6612 BootID:183da118-c1b7-4287-af5d-a72bb0b1fda1 Filesystems:[{Device:/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~projected/kube-api-access-pvq8m DeviceMajor:0 DeviceMinor:235 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/dc9945ac-4041-4120-b504-a173c2bf91bd/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:319 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-700 DeviceMajor:0 DeviceMinor:700 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-705 DeviceMajor:0 DeviceMinor:705 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/141cb120-92da-4d8d-bc29-fc4c433a6336/volumes/kubernetes.io~projected/kube-api-access-fhwd7 DeviceMajor:0 DeviceMinor:793 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-827 DeviceMajor:0 DeviceMinor:827 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-473 DeviceMajor:0 DeviceMinor:473 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:592 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e9ebcecb-c210-434e-83a1-825265e206f1/volumes/kubernetes.io~projected/kube-api-access-txxpw DeviceMajor:0 DeviceMinor:110 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-128 DeviceMajor:0 DeviceMinor:128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~projected/kube-api-access-4mvqh DeviceMajor:0 DeviceMinor:242 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-392 DeviceMajor:0 DeviceMinor:392 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0cb70a30-a8d1-4037-81e6-eb4f0510a234/volumes/kubernetes.io~projected/kube-api-access-q7x89 DeviceMajor:0 DeviceMinor:792 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-900 DeviceMajor:0 DeviceMinor:900 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~projected/kube-api-access-m4rtm DeviceMajor:0 DeviceMinor:227 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~projected/kube-api-access-qv8vk DeviceMajor:0 DeviceMinor:239 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-324 DeviceMajor:0 DeviceMinor:324 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0330a59f41759e27852ba986a4baf743b19081a50c60d6d41faa02679af6ba74/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:449 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-143 DeviceMajor:0 DeviceMinor:143 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-614 DeviceMajor:0 DeviceMinor:614 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8aa0f17a-287e-4a19-9a59-4913e7707071/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:102 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:813 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:450 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/80d062ef94202681a2ce48ec78dd0d061be254ea195e94ee6d413e4b7859e9f7/userdata/shm DeviceMajor:0 DeviceMinor:618 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-748 DeviceMajor:0 DeviceMinor:748 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-718 DeviceMajor:0 DeviceMinor:718 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e927634c086b213dabea9f29d0f72c001d183cc08e2e3143c01e4374d3854c57/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-600 DeviceMajor:0 DeviceMinor:600 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-620 DeviceMajor:0 DeviceMinor:620 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:100 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-77 DeviceMajor:0 DeviceMinor:77 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c2a16f6f-437c-4da5-a797-287e5e1ddbd4/volumes/kubernetes.io~projected/kube-api-access-ws5kr DeviceMajor:0 DeviceMinor:794 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-145 DeviceMajor:0 DeviceMinor:145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/31742478-0d83-48cf-b38b-02416d95d4a8/volumes/kubernetes.io~projected/kube-api-access-wz7d6 DeviceMajor:0 DeviceMinor:791 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~projected/kube-api-access-4n2hg DeviceMajor:0 DeviceMinor:125 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~projected/kube-api-access-mlwd5 DeviceMajor:0 DeviceMinor:234 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-697 DeviceMajor:0 DeviceMinor:697 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-776 DeviceMajor:0 DeviceMinor:776 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:225 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-829 DeviceMajor:0 DeviceMinor:829 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/53283035354bf0bf6eb6445cc3c068855fafc22ec51a56ba7f55c8fa85679204/userdata/shm DeviceMajor:0 DeviceMinor:103 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/83f05b1eef52787aaeaed1465a46122a61b271c0e893c29d510caa22b344a675/userdata/shm DeviceMajor:0 DeviceMinor:390 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-540 DeviceMajor:0 DeviceMinor:540 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ebe3fb9cab9178261c34fb487eaacac7fa326d405ced605571d043522371ecf/userdata/shm DeviceMajor:0 DeviceMinor:597 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3c98a716df7d2169b5450bc2ba979c5ce34c8e642ecf67690bf3ddb21407dcb5/userdata/shm DeviceMajor:0 DeviceMinor:658 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-860 DeviceMajor:0 DeviceMinor:860 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-720 DeviceMajor:0 DeviceMinor:720 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:146 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a823c8bc-09ef-46a9-a1f3-155a34b89788/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-443 DeviceMajor:0 DeviceMinor:443 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-42 DeviceMajor:0 DeviceMinor:42 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6b70c6219cee771d6e858549f53b5dbf8004794c49061a1d0481404af45e4772/userdata/shm DeviceMajor:0 DeviceMinor:772 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-848 DeviceMajor:0 DeviceMinor:848 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-494 DeviceMajor:0 DeviceMinor:494 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-681 DeviceMajor:0 DeviceMinor:681 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-498 DeviceMajor:0 DeviceMinor:498 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f1943401-a75b-4e45-8c65-3cc36018d8c4/volumes/kubernetes.io~projected/kube-api-access-8cxfs DeviceMajor:0 DeviceMinor:863 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb/userdata/shm DeviceMajor:0 DeviceMinor:52 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes/kubernetes.io~projected/kube-api-access-sqzn8 DeviceMajor:0 DeviceMinor:117 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c2a16f6f-437c-4da5-a797-287e5e1ddbd4/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:780 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ea7b959f1f38ada41e7e0e02144ed467ea210e2ba2cad4925d9240f293900cfc/userdata/shm DeviceMajor:0 DeviceMinor:804 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-730 DeviceMajor:0 DeviceMinor:730 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-732 DeviceMajor:0 DeviceMinor:732 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-895 DeviceMajor:0 DeviceMinor:895 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-264 DeviceMajor:0 DeviceMinor:264 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/31742478-0d83-48cf-b38b-02416d95d4a8/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:782 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/98ecc4ed5c1f4462a0059691baf2fae0f1530be7e7fe30902c8e9496f5a61687/userdata/shm DeviceMajor:0 DeviceMinor:86 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:226 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8ff5a0a197bf95ecb5a67e95941757ff6d6a3452f584796b840c247d5169547c/userdata/shm DeviceMajor:0 DeviceMinor:247 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a2636e526bcfbc78b08fe21bedc259f6d8d2021eb2dd29e3a9e4f0bc9ba01bc2/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f8fdab32-4e61-4e9c-a506-52121f625669/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:648 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/633fccc65fe5856fecc01dbcc7e58f5190eed4eb98e5e73385a0e9bcc0746e0e/userdata/shm DeviceMajor:0 DeviceMinor:799 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b2898746-6827-41d9-ac88-64206cb84ac9/volumes/kubernetes.io~projected/kube-api-access-x9zg8 DeviceMajor:0 DeviceMinor:148 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~projected/kube-api-access-n6zkv DeviceMajor:0 DeviceMinor:238 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e09725c2-45c6-4a60-b817-6e5316d6f8e8/volumes/kubernetes.io~projected/kube-api-access-b49lj DeviceMajor:0 DeviceMinor:244 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e3376275-294d-446d-9b4c-930df60dba01/volumes/kubernetes.io~projected/kube-api-access-cgsm7 DeviceMajor:0 DeviceMinor:383 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~projected/kube-api-access-gzntq DeviceMajor:0 DeviceMinor:491 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207/userdata/shm DeviceMajor:0 DeviceMinor:385 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-50 DeviceMajor:0 DeviceMinor:50 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d52fa1ad-0071-4506-bb94-e73d6f15a75c/volumes/kubernetes.io~projected/kube-api-access-xvg4q DeviceMajor:0 DeviceMinor:608 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-515 DeviceMajor:0 DeviceMinor:515 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9770dec3c5a68dfe57ae44a071cba876d6de2453aaa07c370070045080879209/userdata/shm DeviceMajor:0 DeviceMinor:823 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f93b8728-4a33-4ee4-b7c6-cff7d7995953/volumes/kubernetes.io~projected/kube-api-access-kfw5k DeviceMajor:0 DeviceMinor:789 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/140ed89f4b9b2b1bc53aaf33f301b87dd30e7536da0e060bc21d49ac11b53d25/userdata/shm DeviceMajor:0 DeviceMinor:812 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/db42b38e-294e-4016-8ac1-54126ac60de8/volumes/kubernetes.io~projected/kube-api-access-8dwx6 DeviceMajor:0 DeviceMinor:487 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes/kubernetes.io~projected/kube-api-access-2p6wn DeviceMajor:0 DeviceMinor:116 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-706 DeviceMajor:0 DeviceMinor:706 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:94 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-552 DeviceMajor:0 DeviceMinor:552 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-923 DeviceMajor:0 DeviceMinor:923 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d486ce23-acf7-429a-9739-4770e1a2bf78/volumes/kubernetes.io~projected/kube-api-access-bzdjs DeviceMajor:0 DeviceMinor:340 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~projected/kube-api-access-bgmwd DeviceMajor:0 DeviceMinor:233 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~projected/kube-api-access-fsdjh DeviceMajor:0 DeviceMinor:817 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/91645cb3e01b7383bf3c741eaf023e22432d4ae51de307f2749e304f203b0c13/userdata/shm DeviceMajor:0 DeviceMinor:818 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-840 DeviceMajor:0 DeviceMinor:840 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a75049de-dcf1-4102-b339-f45d5015adea/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~projected/kube-api-access-7k8wj DeviceMajor:0 DeviceMinor:224 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/477e80a326e15b39b7b82ddbd8c611ce0d975cfa79226bc1c3506b7ace234991/userdata/shm DeviceMajor:0 DeviceMinor:459 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-837 DeviceMajor:0 DeviceMinor:837 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4316d2211dbd015d08696560a401855badad02d5162bd18e5f9b36a4aa80b6a7/userdata/shm DeviceMajor:0 DeviceMinor:80 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-925 DeviceMajor:0 DeviceMinor:925 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9f7751b6243f5b55d5db7507e92a7214e3b051f064f66c13d1a6b5d546c577a0/userdata/shm DeviceMajor:0 DeviceMinor:461 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3580cc8aaddf6d9ceec4e9655520a84a1d14647aea74906c068b15c17cd230e2/userdata/shm DeviceMajor:0 DeviceMinor:466 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-481 DeviceMajor:0 DeviceMinor:481 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-485 DeviceMajor:0 DeviceMinor:485 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-842 DeviceMajor:0 DeviceMinor:842 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/467c2f01-2c23-41e2-acb9-08a84061fefc/volumes/kubernetes.io~projected/kube-api-access-mxtcq DeviceMajor:0 DeviceMinor:890 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7/volumes/kubernetes.io~projected/kube-api-access-n49x9 DeviceMajor:0 DeviceMinor:769 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:781 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-606 DeviceMajor:0 DeviceMinor:606 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-525 DeviceMajor:0 DeviceMinor:525 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-477 DeviceMajor:0 DeviceMinor:477 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/60e2341ea52796910cf576444e31843d25e05b8bd2f74cb2b05f4a3b9dd9259c/userdata/shm DeviceMajor:0 DeviceMinor:803 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f93b8728-4a33-4ee4-b7c6-cff7d7995953/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:787 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d/userdata/shm DeviceMajor:0 DeviceMinor:53 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-386 DeviceMajor:0 DeviceMinor:386 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-662 DeviceMajor:0 DeviceMinor:662 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3b84ff6dcb01c2864416447de1ea9c58a9ceb02e0ee8e948fe0ed652019990a3/userdata/shm DeviceMajor:0 DeviceMinor:330 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-657 DeviceMajor:0 DeviceMinor:657 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d32541c9-eef6-417c-9f5a-a7392dc70aa0/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:795 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-767 DeviceMajor:0 DeviceMinor:767 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475533312 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:778 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d90f590a-6118-4769-b18f-fec67dd62c20/volumes/kubernetes.io~projected/kube-api-access-nljb2 DeviceMajor:0 DeviceMinor:388 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d486ce23-acf7-429a-9739-4770e1a2bf78/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:230 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-835 DeviceMajor:0 DeviceMinor:835 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-416 DeviceMajor:0 DeviceMinor:416 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/50a1314f615289e97846876591738257b69b7371d4d5221e4bff4229ac719500/userdata/shm DeviceMajor:0 DeviceMinor:468 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-580 DeviceMajor:0 DeviceMinor:580 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-85 DeviceMajor:0 DeviceMinor:85 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~projected/kube-api-access-npxz5 DeviceMajor:0 DeviceMinor:250 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/141cb120-92da-4d8d-bc29-fc4c433a6336/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:783 Capacity:32475533312 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/dea35f60-33be-4ccc-b985-952eac3a85c0/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:91 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b8f39c16-3a94-45c3-a51c-f2e81eff967d/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:579 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-664 DeviceMajor:0 DeviceMinor:664 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ad26131feed8f843d54fc530ac76fa79da18fccdc922829d24cd94f163dc8c43/userdata/shm DeviceMajor:0 DeviceMinor:126 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0199cad4d2d40a08764f1663de391bde31e6d871787f072d972f01e6e0efed56/userdata/shm DeviceMajor:0 DeviceMinor:141 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~projected/kube-api-access-wdmtg DeviceMajor:0 DeviceMinor:229 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/821c52c85783248914e1cb66a5226574cf37830c4faa0aeaafaba66f8e77d10e/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c1983dec9f8f8a439e5c314e9b1a25b285c9dab87a4b4ab4ebf43300415e5937/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d32541c9-eef6-417c-9f5a-a7392dc70aa0/volumes/kubernetes.io~projected/kube-api-access-fvp9m DeviceMajor:0 DeviceMinor:796 Capacity:32475533312 
Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~projected/kube-api-access-fp46p DeviceMajor:0 DeviceMinor:135 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f216606b-43d0-43d0-a3e3-a3ee2952e7b8/volumes/kubernetes.io~projected/kube-api-access-bd8nz DeviceMajor:0 DeviceMinor:240 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dc9945ac-4041-4120-b504-a173c2bf91bd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:318 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-824 DeviceMajor:0 DeviceMinor:824 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/5b36f3b2-caf9-40ad-a3a1-e83796142f54/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/310d604b-fe9a-4b19-b8b5-7a1983e45e67/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b892eee40a06455829cc81eb6e0dd169b807324d4739ad655ee0ca9fb5c8714e/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 
DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-400 DeviceMajor:0 DeviceMinor:400 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33e92e5d-61ea-45b2-b357-ebffdaebf4af/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:451 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-558 DeviceMajor:0 DeviceMinor:558 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~projected/kube-api-access-rnfsx DeviceMajor:0 DeviceMinor:586 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3e2c8814ccc98cac7a5efa0d10dc83cf2e61b0e64624f6788df1c797834583a5/userdata/shm DeviceMajor:0 DeviceMinor:679 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4256d841-23cb-4756-b827-f44ee6e54def/volumes/kubernetes.io~projected/kube-api-access-ptcvr DeviceMajor:0 DeviceMinor:123 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/53ca7c2bbb876011f402aae31332c23a88cd129f0338e1a2144855ba74feb02e/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:299 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2f229196290719614f7bbcbd70dc1d3eb6df4440414271052c0e25cb9764e057/userdata/shm DeviceMajor:0 DeviceMinor:695 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-359 DeviceMajor:0 DeviceMinor:359 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f0d16aa2-494d-4a65-880d-3d87219178b5/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:814 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-165 DeviceMajor:0 DeviceMinor:165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-401 DeviceMajor:0 DeviceMinor:401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:585 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-334 DeviceMajor:0 DeviceMinor:334 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/89be0036-a2c8-48b4-9eaf-17fab972c4f4/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:754 Capacity:200003584 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4c8b86557bfc322b9f1b1feea17aaefaf34263c41685f5164a347ec08c589e8/userdata/shm DeviceMajor:0 DeviceMinor:764 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1a7fdb4a43ba25a5a6578015b2edb59a1883c19bd038650fc1b7e8bb9f8cb9fe/userdata/shm DeviceMajor:0 DeviceMinor:788 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/96902651-8e2b-44c2-be80-0a8c7c28cb58/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:134 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-335 DeviceMajor:0 DeviceMinor:335 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4d0ada47f0cb160d98966d63d8a86c801dfaedca21b5932c03647c7678f530ef/userdata/shm DeviceMajor:0 DeviceMinor:457 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/volumes/kubernetes.io~projected/kube-api-access-qh4t8 DeviceMajor:0 DeviceMinor:237 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:454 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:455 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-655 DeviceMajor:0 DeviceMinor:655 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f7e0d1fae2c29d1550044dbfbc303fe4f5bb6dc47066c479df51113017952abe/userdata/shm DeviceMajor:0 DeviceMinor:465 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-326 DeviceMajor:0 DeviceMinor:326 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-857 DeviceMajor:0 DeviceMinor:857 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-908 DeviceMajor:0 
DeviceMinor:908 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:472 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:95 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dcd974bd0f0964a65ab29ef3997c50dfd49fb09b0d23f80973950611596b8b0a/userdata/shm DeviceMajor:0 DeviceMinor:464 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-801 DeviceMajor:0 DeviceMinor:801 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~projected/kube-api-access-qn48v DeviceMajor:0 DeviceMinor:232 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/75f211854713a8c265774fbfcbda7e10ba7bc52775fdc4cf5a9c7e3a17e4fafc/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-903 DeviceMajor:0 DeviceMinor:903 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/583856dbacb5dc5e9529b9ca02e0d5f443ece406b459258f60f347711cce62fd/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-380 DeviceMajor:0 DeviceMinor:380 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-861 DeviceMajor:0 DeviceMinor:861 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/86c4b0e4-3481-465d-b00f-022d2c58c183/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~projected/kube-api-access-vr9dj DeviceMajor:0 DeviceMinor:595 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-683 DeviceMajor:0 DeviceMinor:683 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-864 DeviceMajor:0 DeviceMinor:864 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1038c6a91bb0394b9f1a3e92f46c92dba250febe4aaf879093674a3b7750a66e/userdata/shm DeviceMajor:0 DeviceMinor:878 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/62125637029a850812cbf1a1551ac9bf8a2431cbf9d2111e28185931308bf215/userdata/shm DeviceMajor:0 DeviceMinor:458 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-520 DeviceMajor:0 DeviceMinor:520 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/208939f5-8fca-4fd5-b0c6-43484b7d1e30/volumes/kubernetes.io~projected/kube-api-access-lktk8 DeviceMajor:0 DeviceMinor:253 Capacity:32475533312 
Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f3d541255d94e7adb76b98fa9b12f1b4b02507d8361efa09820cd8f3dca7ff37/userdata/shm DeviceMajor:0 DeviceMinor:308 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-441 DeviceMajor:0 DeviceMinor:441 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/083882c0-ea2f-4405-8cf1-cce5b91fe602/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16baf9775f985e1b480f87b400eeeae8104d091a33d3bd5c1b39213f99e3a679/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-660 DeviceMajor:0 DeviceMinor:660 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e06198d02d459687a486cc15adc7dc083fe3318eadd2b488354e24ff71bc8330/userdata/shm DeviceMajor:0 DeviceMinor:408 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-535 DeviceMajor:0 DeviceMinor:535 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9ca444a4-4d78-456f-9656-0c28076ce77e/volumes/kubernetes.io~projected/kube-api-access-kt22g DeviceMajor:0 DeviceMinor:786 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9663cc40-a69d-42ba-890e-071cb85062f5/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:223 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d90f590a-6118-4769-b18f-fec67dd62c20/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:387 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/db42b38e-294e-4016-8ac1-54126ac60de8/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:478 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-604 DeviceMajor:0 DeviceMinor:604 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:101 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-850 DeviceMajor:0 DeviceMinor:850 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-414 DeviceMajor:0 DeviceMinor:414 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-419 DeviceMajor:0 DeviceMinor:419 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:594 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-496 DeviceMajor:0 DeviceMinor:496 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-502 DeviceMajor:0 DeviceMinor:502 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/kube-api-access-548cd DeviceMajor:0 DeviceMinor:241 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/daa8d078ddf9aa01cf01bb6323c8070a780d54a28938469ce42348c764525db1/userdata/shm DeviceMajor:0 DeviceMinor:378 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/78ed7df2de04c4d9012bf3b0bae0730cc7f525024f23a27fe0e47c32e46b41f6/userdata/shm DeviceMajor:0 DeviceMinor:488 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210/userdata/shm DeviceMajor:0 DeviceMinor:107 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9ca444a4-4d78-456f-9656-0c28076ce77e/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:779 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:243 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/948a9c37f749c61db89536b56fc5ebfbd9515fadb98fb3cf0cd9cfac9adb0c7f/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-622 DeviceMajor:0 DeviceMinor:622 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-737 DeviceMajor:0 DeviceMinor:737 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes/kubernetes.io~projected/kube-api-access-w6qs5 DeviceMajor:0 DeviceMinor:790 
Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:785 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/872e5f8c-b014-4283-a4d2-0e2cfd29e192/volumes/kubernetes.io~projected/kube-api-access-kfpv6 DeviceMajor:0 DeviceMinor:43 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b2bff8a5-c45d-4d28-8771-2239ad0fa578/volumes/kubernetes.io~projected/kube-api-access-s2ntw DeviceMajor:0 DeviceMinor:500 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:114 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7ce25d0833a4b4914270f5d82edb7a1d2046516be1c792659a8b92bdeaf1ab42/userdata/shm DeviceMajor:0 DeviceMinor:147 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-312 DeviceMajor:0 DeviceMinor:312 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-504 DeviceMajor:0 DeviceMinor:504 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-760 DeviceMajor:0 DeviceMinor:760 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/467c2f01-2c23-41e2-acb9-08a84061fefc/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:889 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-917 DeviceMajor:0 DeviceMinor:917 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/357980ba-1957-412f-afb5-04281eca2bee/volumes/kubernetes.io~projected/kube-api-access-8zvxj DeviceMajor:0 DeviceMinor:228 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-483 DeviceMajor:0 DeviceMinor:483 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/88dd8210417d34cd695549010f86bdfe2541add1af48e0e0b07c7ed8f524f103/userdata/shm DeviceMajor:0 DeviceMinor:514 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-438 DeviceMajor:0 DeviceMinor:438 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dd1819a433e70ea4c2b01b165e8a76f6644d7959ff5dbef7efb1f362b56038c1/userdata/shm DeviceMajor:0 DeviceMinor:300 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-529 DeviceMajor:0 DeviceMinor:529 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-556 DeviceMajor:0 DeviceMinor:556 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-685 DeviceMajor:0 DeviceMinor:685 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/39bf78ac-304b-4b82-8729-d184657ef3bb/volumes/kubernetes.io~projected/kube-api-access-rltcj DeviceMajor:0 DeviceMinor:806 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/5ae9a109908423db1f7d35a526931ed2af44da77833edc3112c7f12de82644eb/userdata/shm DeviceMajor:0 DeviceMinor:808 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823/volumes/kubernetes.io~projected/kube-api-access-ft9rs DeviceMajor:0 DeviceMinor:99 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a8e2194-aba6-4929-a29c-47c63c8ff799/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:259 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9bea8e39775551acb259adea0fc4cfd103c16875f290afb2712a31409a51f01c/userdata/shm DeviceMajor:0 DeviceMinor:382 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-418 DeviceMajor:0 DeviceMinor:418 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/58d5b64552b14fa37f1c4ade1890dfcbcf78def52cdf495457e904377a1b0a43/userdata/shm DeviceMajor:0 DeviceMinor:758 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1dd59466-0133-41fe-a648-28db73aa861b/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:490 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-847 DeviceMajor:0 DeviceMinor:847 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-933 DeviceMajor:0 DeviceMinor:933 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f64ca30d2cf598d32dbab617a0a172e7aa2a1cb9512109dd3142530e06881cb4/userdata/shm DeviceMajor:0 DeviceMinor:816 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-314 DeviceMajor:0 DeviceMinor:314 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ece5177b-ae15-4c33-a8d4-612ab50b2b8b/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:456 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b8f39c16-3a94-45c3-a51c-f2e81eff967d/volumes/kubernetes.io~projected/kube-api-access-qmdlx DeviceMajor:0 DeviceMinor:591 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bf966775e86bfc1949a8a9f5db81e0ee9b20d3350007cbf3457786493d88b741/userdata/shm DeviceMajor:0 DeviceMinor:596 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cd42096c-f18d-4bb5-8a51-8761dc1edb73/volumes/kubernetes.io~projected/kube-api-access-dxdb6 DeviceMajor:0 DeviceMinor:798 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4c44a8a218f4d3a8bf81e0a2e78942dceac9d2b2c4c60ba4ca23a60c107ed3b/userdata/shm DeviceMajor:0 DeviceMinor:821 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-881 DeviceMajor:0 DeviceMinor:881 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a417fe25-4aca-471c-941d-c195b6141042/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:452 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-548 DeviceMajor:0 DeviceMinor:548 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3a4fd337-c385-4f56-965c-d68ee0a4e848/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:593 Capacity:32475533312 Type:vfs Inodes:4108170 
HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc/volumes/kubernetes.io~projected/kube-api-access-2zz2n DeviceMajor:0 DeviceMinor:363 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-475 DeviceMajor:0 DeviceMinor:475 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-742 DeviceMajor:0 DeviceMinor:742 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-306 DeviceMajor:0 DeviceMinor:306 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-396 DeviceMajor:0 DeviceMinor:396 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7fda0d28-6511-4577-9cd3-58a9c1a64d4e/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:584 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0cb70a30-a8d1-4037-81e6-eb4f0510a234/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:784 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/307605e6-d1cf-4172-8e7d-918c435f3577/volumes/kubernetes.io~projected/kube-api-access-wrs54 DeviceMajor:0 DeviceMinor:303 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/15eda3bde3926ace98dc82fe5b6fb4d1ace5d01b315a5e6ece92e5b50ae9132e/userdata/shm DeviceMajor:0 DeviceMinor:304 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f8fdab32-4e61-4e9c-a506-52121f625669/volumes/kubernetes.io~projected/kube-api-access-5xl5z DeviceMajor:0 
DeviceMinor:678 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-852 DeviceMajor:0 DeviceMinor:852 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bec90db1-02e3-4211-8c33-f8bcc304e3a7/volumes/kubernetes.io~projected/kube-api-access-nr5cd DeviceMajor:0 DeviceMinor:245 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9ac42112-6a00-4c17-b230-75b565aa668f/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:453 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-577 DeviceMajor:0 DeviceMinor:577 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-883 DeviceMajor:0 DeviceMinor:883 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1f2148fe-f9f6-47da-894c-b88dae360ebe/volumes/kubernetes.io~projected/kube-api-access-47czp DeviceMajor:0 DeviceMinor:231 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-398 DeviceMajor:0 DeviceMinor:398 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:115 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0393a35bbe4b19a3e9ea308aa492673dee8523d90083f95724357dd38620d600/userdata/shm DeviceMajor:0 DeviceMinor:809 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-609 DeviceMajor:0 DeviceMinor:609 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-625 DeviceMajor:0 DeviceMinor:625 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-93 DeviceMajor:0 DeviceMinor:93 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/68b78edecfaa0767b2a9ec13b06b870fd624336582c98e3d4f8f932f455459d7/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7778d952a4063165f8bb8e547abc986e6bc52a9b3b98034ff13b26c82386c41e/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-445 DeviceMajor:0 DeviceMinor:445 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-447 DeviceMajor:0 DeviceMinor:447 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dd69fc33-59d4-4538-b4ec-e2d08ac11f72/volumes/kubernetes.io~projected/kube-api-access-txp58 DeviceMajor:0 DeviceMinor:770 Capacity:32475533312 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0330a59f41759e2 MacAddress:ce:a8:8a:51:b4:fe Speed:10000 Mtu:8900} {Name:0393a35bbe4b19a MacAddress:3a:5a:07:07:08:ec Speed:10000 Mtu:8900} {Name:1038c6a91bb0394 MacAddress:6a:d1:e5:10:37:3e Speed:10000 Mtu:8900} {Name:140ed89f4b9b2b1 MacAddress:4a:15:6b:40:06:02 Speed:10000 Mtu:8900} {Name:15eda3bde3926ac MacAddress:92:06:c4:93:dd:73 Speed:10000 Mtu:8900} {Name:16baf9775f985e1 MacAddress:02:72:ae:5f:f9:8b Speed:10000 Mtu:8900} {Name:1a7fdb4a43ba25a MacAddress:6e:6d:26:14:a5:d3 Speed:10000 Mtu:8900} {Name:2ebe3fb9cab9178 MacAddress:86:b6:0c:82:89:52 
Speed:10000 Mtu:8900} {Name:2f2291962907196 MacAddress:5e:ff:e0:04:c6:0d Speed:10000 Mtu:8900} {Name:3580cc8aaddf6d9 MacAddress:86:95:6c:7e:90:9e Speed:10000 Mtu:8900} {Name:3b84ff6dcb01c28 MacAddress:12:1a:34:84:10:6d Speed:10000 Mtu:8900} {Name:3e2c8814ccc98ca MacAddress:c6:57:06:04:7f:da Speed:10000 Mtu:8900} {Name:477e80a326e15b3 MacAddress:a6:f2:a4:51:23:dd Speed:10000 Mtu:8900} {Name:4d0ada47f0cb160 MacAddress:ce:7d:09:dc:8b:8a Speed:10000 Mtu:8900} {Name:50a1314f615289e MacAddress:8a:fd:c7:b2:be:e9 Speed:10000 Mtu:8900} {Name:53ca7c2bbb87601 MacAddress:1e:fc:65:15:b0:d8 Speed:10000 Mtu:8900} {Name:58d5b64552b14fa MacAddress:7a:c8:18:23:ce:d8 Speed:10000 Mtu:8900} {Name:5995c7b8ffe029a MacAddress:ca:b0:cd:a1:2a:78 Speed:10000 Mtu:8900} {Name:5ae9a109908423d MacAddress:36:50:4e:10:ed:ff Speed:10000 Mtu:8900} {Name:60e2341ea527969 MacAddress:ce:66:bd:9e:b8:11 Speed:10000 Mtu:8900} {Name:62125637029a850 MacAddress:12:e2:9d:a0:19:31 Speed:10000 Mtu:8900} {Name:6b70c6219cee771 MacAddress:36:c0:42:52:a7:45 Speed:10000 Mtu:8900} {Name:75f211854713a8c MacAddress:ee:0a:21:76:6c:9d Speed:10000 Mtu:8900} {Name:7778d952a406316 MacAddress:ce:44:e1:04:08:30 Speed:10000 Mtu:8900} {Name:78ed7df2de04c4d MacAddress:56:34:29:2c:8a:a3 Speed:10000 Mtu:8900} {Name:821c52c85783248 MacAddress:1a:23:ae:ff:65:fc Speed:10000 Mtu:8900} {Name:83f05b1eef52787 MacAddress:9a:7b:af:28:ea:f0 Speed:10000 Mtu:8900} {Name:88dd8210417d34c MacAddress:3a:3b:80:5e:4b:77 Speed:10000 Mtu:8900} {Name:8ff5a0a197bf95e MacAddress:e2:af:17:bc:be:bc Speed:10000 Mtu:8900} {Name:91645cb3e01b738 MacAddress:4e:4e:c1:49:7d:f0 Speed:10000 Mtu:8900} {Name:948a9c37f749c61 MacAddress:b2:b9:56:45:32:6c Speed:10000 Mtu:8900} {Name:9770dec3c5a68df MacAddress:f2:9b:8a:94:2a:28 Speed:10000 Mtu:8900} {Name:9bea8e39775551a MacAddress:62:e6:13:5f:29:e9 Speed:10000 Mtu:8900} {Name:9f7751b6243f5b5 MacAddress:06:e3:3e:df:e3:04 Speed:10000 Mtu:8900} {Name:a2636e526bcfbc7 MacAddress:ea:76:86:b2:1c:e8 Speed:10000 Mtu:8900} 
{Name:b892eee40a06455 MacAddress:f2:b7:c1:f2:45:60 Speed:10000 Mtu:8900} {Name:bf966775e86bfc1 MacAddress:42:5c:a3:b7:30:9b Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:c6:4a:11:0c:41:f4 Speed:0 Mtu:8900} {Name:c1983dec9f8f8a4 MacAddress:66:2c:8c:3d:05:43 Speed:10000 Mtu:8900} {Name:daa8d078ddf9aa0 MacAddress:16:78:08:c0:37:df Speed:10000 Mtu:8900} {Name:dcd974bd0f0964a MacAddress:c2:92:b3:18:61:dc Speed:10000 Mtu:8900} {Name:dd1819a433e70ea MacAddress:be:36:d8:99:cb:b4 Speed:10000 Mtu:8900} {Name:e06198d02d45968 MacAddress:42:a8:24:2c:f2:1d Speed:10000 Mtu:8900} {Name:e4c44a8a218f4d3 MacAddress:2e:f8:b0:36:44:26 Speed:10000 Mtu:8900} {Name:e4c8b86557bfc32 MacAddress:16:1b:84:c2:eb:6c Speed:10000 Mtu:8900} {Name:e927634c086b213 MacAddress:7e:95:33:52:a5:ed Speed:10000 Mtu:8900} {Name:ea7b959f1f38ada MacAddress:a6:80:c1:b0:aa:45 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:d5:00:d5 Speed:-1 Mtu:9000} {Name:f3d541255d94e7a MacAddress:5e:a1:ee:b1:03:17 Speed:10000 Mtu:8900} {Name:f64ca30d2cf598d MacAddress:b6:00:0e:20:b0:88 Speed:10000 Mtu:8900} {Name:f7e0d1fae2c29d1 MacAddress:d2:62:5c:28:af:0b Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:d6:48:60:15:e4:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 
Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 09:24:42.752909 master-0 kubenswrapper[15202]: I0319 09:24:42.752331 15202 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 19 09:24:42.752909 master-0 kubenswrapper[15202]: I0319 09:24:42.752392 15202 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 09:24:42.752909 master-0 kubenswrapper[15202]: I0319 09:24:42.752652 15202 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 09:24:42.752909 master-0 kubenswrapper[15202]: I0319 09:24:42.752832 15202 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 09:24:42.753186 master-0 kubenswrapper[15202]: I0319 09:24:42.752866 15202 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 09:24:42.753186 master-0 kubenswrapper[15202]: I0319 09:24:42.753070 15202 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 09:24:42.753186 master-0 kubenswrapper[15202]: I0319 09:24:42.753078 15202 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 09:24:42.753186 master-0 kubenswrapper[15202]: I0319 09:24:42.753086 15202 
manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:24:42.753186 master-0 kubenswrapper[15202]: I0319 09:24:42.753107 15202 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 09:24:42.753186 master-0 kubenswrapper[15202]: I0319 09:24:42.753140 15202 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:24:42.753428 master-0 kubenswrapper[15202]: I0319 09:24:42.753215 15202 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 09:24:42.753428 master-0 kubenswrapper[15202]: I0319 09:24:42.753279 15202 kubelet.go:418] "Attempting to sync node with API server" Mar 19 09:24:42.753428 master-0 kubenswrapper[15202]: I0319 09:24:42.753291 15202 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 09:24:42.753428 master-0 kubenswrapper[15202]: I0319 09:24:42.753304 15202 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 09:24:42.753428 master-0 kubenswrapper[15202]: I0319 09:24:42.753316 15202 kubelet.go:324] "Adding apiserver pod source" Mar 19 09:24:42.753428 master-0 kubenswrapper[15202]: I0319 09:24:42.753326 15202 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 09:24:42.754717 master-0 kubenswrapper[15202]: I0319 09:24:42.754680 15202 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 09:24:42.754808 master-0 kubenswrapper[15202]: W0319 09:24:42.754677 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:42.754808 master-0 kubenswrapper[15202]: E0319 09:24:42.754780 15202 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:24:42.754905 master-0 kubenswrapper[15202]: I0319 09:24:42.754887 15202 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 19 09:24:42.755038 master-0 kubenswrapper[15202]: W0319 09:24:42.754919 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:42.755091 master-0 kubenswrapper[15202]: E0319 09:24:42.755061 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:24:42.755182 master-0 kubenswrapper[15202]: I0319 09:24:42.755163 15202 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 09:24:42.755302 master-0 kubenswrapper[15202]: I0319 09:24:42.755285 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 09:24:42.755344 master-0 kubenswrapper[15202]: I0319 09:24:42.755305 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 09:24:42.755344 master-0 kubenswrapper[15202]: I0319 09:24:42.755314 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 09:24:42.755344 
master-0 kubenswrapper[15202]: I0319 09:24:42.755321 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 09:24:42.755344 master-0 kubenswrapper[15202]: I0319 09:24:42.755329 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 09:24:42.755344 master-0 kubenswrapper[15202]: I0319 09:24:42.755336 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 09:24:42.755344 master-0 kubenswrapper[15202]: I0319 09:24:42.755343 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 09:24:42.755344 master-0 kubenswrapper[15202]: I0319 09:24:42.755350 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 09:24:42.755626 master-0 kubenswrapper[15202]: I0319 09:24:42.755359 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 09:24:42.755626 master-0 kubenswrapper[15202]: I0319 09:24:42.755366 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 09:24:42.755626 master-0 kubenswrapper[15202]: I0319 09:24:42.755387 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 09:24:42.755626 master-0 kubenswrapper[15202]: I0319 09:24:42.755399 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 09:24:42.755626 master-0 kubenswrapper[15202]: I0319 09:24:42.755424 15202 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 09:24:42.755874 master-0 kubenswrapper[15202]: I0319 09:24:42.755855 15202 server.go:1280] "Started kubelet" Mar 19 09:24:42.756123 master-0 kubenswrapper[15202]: I0319 09:24:42.756066 15202 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 09:24:42.770639 master-0 kubenswrapper[15202]: I0319 09:24:42.756226 15202 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 
09:24:42.770639 master-0 kubenswrapper[15202]: I0319 09:24:42.756342 15202 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 09:24:42.770639 master-0 kubenswrapper[15202]: I0319 09:24:42.756885 15202 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:42.770639 master-0 kubenswrapper[15202]: I0319 09:24:42.756936 15202 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 09:24:42.770639 master-0 kubenswrapper[15202]: E0319 09:24:42.756348 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e33d2945d16b3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:24:42.755831475 +0000 UTC m=+0.141246291,LastTimestamp:2026-03-19 09:24:42.755831475 +0000 UTC m=+0.141246291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:24:42.770639 master-0 kubenswrapper[15202]: I0319 09:24:42.767611 15202 server.go:449] "Adding debug handlers to kubelet server" Mar 19 09:24:42.756660 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.775783 15202 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.775839 15202 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.775964 15202 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 09:09:00 +0000 UTC, rotation deadline is 2026-03-20 02:42:28.707514718 +0000 UTC Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.776018 15202 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h17m45.931499089s for next certificate rotation Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.776021 15202 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.776088 15202 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: I0319 09:24:42.776286 15202 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 19 09:24:42.780764 master-0 kubenswrapper[15202]: E0319 09:24:42.776095 15202 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781513 15202 factory.go:153] Registering CRI-O factory Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781542 15202 factory.go:221] Registration of the crio container factory successfully Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781607 15202 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such 
file or directory
Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781616 15202 factory.go:55] Registering systemd factory
Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781622 15202 factory.go:221] Registration of the systemd container factory successfully
Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781641 15202 factory.go:103] Registering Raw factory
Mar 19 09:24:42.781831 master-0 kubenswrapper[15202]: I0319 09:24:42.781654 15202 manager.go:1196] Started watching for new ooms in manager
Mar 19 09:24:42.782056 master-0 kubenswrapper[15202]: E0319 09:24:42.781881 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 09:24:42.782056 master-0 kubenswrapper[15202]: W0319 09:24:42.781881 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:42.782056 master-0 kubenswrapper[15202]: E0319 09:24:42.781965 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:24:42.782162 master-0 kubenswrapper[15202]: I0319 09:24:42.782075 15202 manager.go:319] Starting recovery of all containers
Mar 19 09:24:42.791978 master-0 kubenswrapper[15202]: I0319 09:24:42.791873 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a75049de-dcf1-4102-b339-f45d5015adea" volumeName="kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh" seLinuxMountContext=""
Mar 19 09:24:42.792059 master-0 kubenswrapper[15202]: I0319 09:24:42.791976 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d486ce23-acf7-429a-9739-4770e1a2bf78" volumeName="kubernetes.io/projected/d486ce23-acf7-429a-9739-4770e1a2bf78-kube-api-access-bzdjs" seLinuxMountContext=""
Mar 19 09:24:42.792059 master-0 kubenswrapper[15202]: I0319 09:24:42.792048 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cb70a30-a8d1-4037-81e6-eb4f0510a234" volumeName="kubernetes.io/empty-dir/0cb70a30-a8d1-4037-81e6-eb4f0510a234-snapshots" seLinuxMountContext=""
Mar 19 09:24:42.792125 master-0 kubenswrapper[15202]: I0319 09:24:42.792067 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39bf78ac-304b-4b82-8729-d184657ef3bb" volumeName="kubernetes.io/projected/39bf78ac-304b-4b82-8729-d184657ef3bb-kube-api-access-rltcj" seLinuxMountContext=""
Mar 19 09:24:42.792125 master-0 kubenswrapper[15202]: I0319 09:24:42.792094 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" volumeName="kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config" seLinuxMountContext=""
Mar 19 09:24:42.792125 master-0 kubenswrapper[15202]: I0319 09:24:42.792124 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ca444a4-4d78-456f-9656-0c28076ce77e" volumeName="kubernetes.io/projected/9ca444a4-4d78-456f-9656-0c28076ce77e-kube-api-access-kt22g" seLinuxMountContext=""
Mar 19 09:24:42.792220 master-0 kubenswrapper[15202]: I0319 09:24:42.792134 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8f39c16-3a94-45c3-a51c-f2e81eff967d" volumeName="kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls" seLinuxMountContext=""
Mar 19 09:24:42.792220 master-0 kubenswrapper[15202]: I0319 09:24:42.792148 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ece5177b-ae15-4c33-a8d4-612ab50b2b8b" volumeName="kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m" seLinuxMountContext=""
Mar 19 09:24:42.792220 master-0 kubenswrapper[15202]: I0319 09:24:42.792165 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1943401-a75b-4e45-8c65-3cc36018d8c4" volumeName="kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-catalog-content" seLinuxMountContext=""
Mar 19 09:24:42.792220 master-0 kubenswrapper[15202]: I0319 09:24:42.792201 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f93b8728-4a33-4ee4-b7c6-cff7d7995953" volumeName="kubernetes.io/projected/f93b8728-4a33-4ee4-b7c6-cff7d7995953-kube-api-access-kfw5k" seLinuxMountContext=""
Mar 19 09:24:42.792220 master-0 kubenswrapper[15202]: I0319 09:24:42.792217 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4256d841-23cb-4756-b827-f44ee6e54def" volumeName="kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs" seLinuxMountContext=""
Mar 19 09:24:42.792379 master-0 kubenswrapper[15202]: I0319 09:24:42.792235 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="872e5f8c-b014-4283-a4d2-0e2cfd29e192" volumeName="kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6" seLinuxMountContext=""
Mar 19 09:24:42.792379 master-0 kubenswrapper[15202]: I0319 09:24:42.792253 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8aa0f17a-287e-4a19-9a59-4913e7707071" volumeName="kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert" seLinuxMountContext=""
Mar 19 09:24:42.792379 master-0 kubenswrapper[15202]: I0319 09:24:42.792272 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db42b38e-294e-4016-8ac1-54126ac60de8" volumeName="kubernetes.io/empty-dir/db42b38e-294e-4016-8ac1-54126ac60de8-cache" seLinuxMountContext=""
Mar 19 09:24:42.792379 master-0 kubenswrapper[15202]: I0319 09:24:42.792283 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc9945ac-4041-4120-b504-a173c2bf91bd" volumeName="kubernetes.io/secret/dc9945ac-4041-4120-b504-a173c2bf91bd-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.792379 master-0 kubenswrapper[15202]: I0319 09:24:42.792353 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1943401-a75b-4e45-8c65-3cc36018d8c4" volumeName="kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-utilities" seLinuxMountContext=""
Mar 19 09:24:42.792544 master-0 kubenswrapper[15202]: I0319 09:24:42.792397 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b36f3b2-caf9-40ad-a3a1-e83796142f54" volumeName="kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.792544 master-0 kubenswrapper[15202]: I0319 09:24:42.792469 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.792613 master-0 kubenswrapper[15202]: I0319 09:24:42.792567 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d90f590a-6118-4769-b18f-fec67dd62c20" volumeName="kubernetes.io/secret/d90f590a-6118-4769-b18f-fec67dd62c20-signing-key" seLinuxMountContext=""
Mar 19 09:24:42.792613 master-0 kubenswrapper[15202]: I0319 09:24:42.792588 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="467c2f01-2c23-41e2-acb9-08a84061fefc" volumeName="kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls" seLinuxMountContext=""
Mar 19 09:24:42.792756 master-0 kubenswrapper[15202]: I0319 09:24:42.792712 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client" seLinuxMountContext=""
Mar 19 09:24:42.792803 master-0 kubenswrapper[15202]: I0319 09:24:42.792768 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="083882c0-ea2f-4405-8cf1-cce5b91fe602" volumeName="kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config" seLinuxMountContext=""
Mar 19 09:24:42.792803 master-0 kubenswrapper[15202]: I0319 09:24:42.792792 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1dd59466-0133-41fe-a648-28db73aa861b" volumeName="kubernetes.io/empty-dir/1dd59466-0133-41fe-a648-28db73aa861b-cache" seLinuxMountContext=""
Mar 19 09:24:42.792867 master-0 kubenswrapper[15202]: I0319 09:24:42.792803 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31742478-0d83-48cf-b38b-02416d95d4a8" volumeName="kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.792867 master-0 kubenswrapper[15202]: I0319 09:24:42.792816 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86c4b0e4-3481-465d-b00f-022d2c58c183" volumeName="kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v" seLinuxMountContext=""
Mar 19 09:24:42.792867 master-0 kubenswrapper[15202]: I0319 09:24:42.792831 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db42b38e-294e-4016-8ac1-54126ac60de8" volumeName="kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-kube-api-access-8dwx6" seLinuxMountContext=""
Mar 19 09:24:42.792867 master-0 kubenswrapper[15202]: I0319 09:24:42.792847 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" volumeName="kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.792867 master-0 kubenswrapper[15202]: I0319 09:24:42.792867 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="141cb120-92da-4d8d-bc29-fc4c433a6336" volumeName="kubernetes.io/projected/141cb120-92da-4d8d-bc29-fc4c433a6336-kube-api-access-fhwd7" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.792884 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="208939f5-8fca-4fd5-b0c6-43484b7d1e30" volumeName="kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.792896 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.792908 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31742478-0d83-48cf-b38b-02416d95d4a8" volumeName="kubernetes.io/projected/31742478-0d83-48cf-b38b-02416d95d4a8-kube-api-access-wz7d6" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.792948 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8aa0f17a-287e-4a19-9a59-4913e7707071" volumeName="kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.792984 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a16f6f-437c-4da5-a797-287e5e1ddbd4" volumeName="kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.793000 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd42096c-f18d-4bb5-8a51-8761dc1edb73" volumeName="kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert" seLinuxMountContext=""
Mar 19 09:24:42.793024 master-0 kubenswrapper[15202]: I0319 09:24:42.793019 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd69fc33-59d4-4538-b4ec-e2d08ac11f72" volumeName="kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-utilities" seLinuxMountContext=""
Mar 19 09:24:42.793237 master-0 kubenswrapper[15202]: I0319 09:24:42.793034 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0d16aa2-494d-4a65-880d-3d87219178b5" volumeName="kubernetes.io/projected/f0d16aa2-494d-4a65-880d-3d87219178b5-kube-api-access-fsdjh" seLinuxMountContext=""
Mar 19 09:24:42.793237 master-0 kubenswrapper[15202]: I0319 09:24:42.793051 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cb70a30-a8d1-4037-81e6-eb4f0510a234" volumeName="kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 09:24:42.793237 master-0 kubenswrapper[15202]: I0319 09:24:42.793065 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/projected/3a4fd337-c385-4f56-965c-d68ee0a4e848-kube-api-access-vr9dj" seLinuxMountContext=""
Mar 19 09:24:42.793237 master-0 kubenswrapper[15202]: I0319 09:24:42.793079 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd" seLinuxMountContext=""
Mar 19 09:24:42.793433 master-0 kubenswrapper[15202]: I0319 09:24:42.793106 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca" seLinuxMountContext=""
Mar 19 09:24:42.793516 master-0 kubenswrapper[15202]: I0319 09:24:42.793434 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw" seLinuxMountContext=""
Mar 19 09:24:42.793516 master-0 kubenswrapper[15202]: I0319 09:24:42.793450 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fedd4b33-c90e-42d5-bc29-73d1701bb671" volumeName="kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.793516 master-0 kubenswrapper[15202]: I0319 09:24:42.793469 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cb70a30-a8d1-4037-81e6-eb4f0510a234" volumeName="kubernetes.io/secret/0cb70a30-a8d1-4037-81e6-eb4f0510a234-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.793516 master-0 kubenswrapper[15202]: I0319 09:24:42.793496 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="307605e6-d1cf-4172-8e7d-918c435f3577" volumeName="kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54" seLinuxMountContext=""
Mar 19 09:24:42.793516 master-0 kubenswrapper[15202]: I0319 09:24:42.793510 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fda0d28-6511-4577-9cd3-58a9c1a64d4e" volumeName="kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-tmp" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793528 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dea35f60-33be-4ccc-b985-952eac3a85c0" volumeName="kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793542 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86c4b0e4-3481-465d-b00f-022d2c58c183" volumeName="kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793560 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d32541c9-eef6-417c-9f5a-a7392dc70aa0" volumeName="kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793573 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d52fa1ad-0071-4506-bb94-e73d6f15a75c" volumeName="kubernetes.io/projected/d52fa1ad-0071-4506-bb94-e73d6f15a75c-kube-api-access-xvg4q" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793590 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793617 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd69fc33-59d4-4538-b4ec-e2d08ac11f72" volumeName="kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-catalog-content" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793631 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="083882c0-ea2f-4405-8cf1-cce5b91fe602" volumeName="kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793662 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4256d841-23cb-4756-b827-f44ee6e54def" volumeName="kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr" seLinuxMountContext=""
Mar 19 09:24:42.793683 master-0 kubenswrapper[15202]: I0319 09:24:42.793684 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793701 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793727 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d90f590a-6118-4769-b18f-fec67dd62c20" volumeName="kubernetes.io/projected/d90f590a-6118-4769-b18f-fec67dd62c20-kube-api-access-nljb2" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793747 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0d16aa2-494d-4a65-880d-3d87219178b5" volumeName="kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793770 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1943401-a75b-4e45-8c65-3cc36018d8c4" volumeName="kubernetes.io/projected/f1943401-a75b-4e45-8c65-3cc36018d8c4-kube-api-access-8cxfs" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793791 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793815 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793830 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-serving-ca" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793849 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d017ee-b94e-402f-90c1-ccb3f336b2a8" volumeName="kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793864 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793877 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1dd59466-0133-41fe-a648-28db73aa861b" volumeName="kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-ca-certs" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793894 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793915 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" volumeName="kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793929 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89b0e82c-1cd1-45aa-9cab-2d11320a1ff7" volumeName="kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-utilities" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793966 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ac42112-6a00-4c17-b230-75b565aa668f" volumeName="kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793981 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d017ee-b94e-402f-90c1-ccb3f336b2a8" volumeName="kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.793998 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d017ee-b94e-402f-90c1-ccb3f336b2a8" volumeName="kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.794012 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="141cb120-92da-4d8d-bc29-fc4c433a6336" volumeName="kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.794027 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d90f590a-6118-4769-b18f-fec67dd62c20" volumeName="kubernetes.io/configmap/d90f590a-6118-4769-b18f-fec67dd62c20-signing-cabundle" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.794045 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ece5177b-ae15-4c33-a8d4-612ab50b2b8b" volumeName="kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.794058 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce38ec35-8f00-4060-a620-1759a6bbef66" volumeName="kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.794074 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1dd59466-0133-41fe-a648-28db73aa861b" volumeName="kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-kube-api-access-gzntq" seLinuxMountContext=""
Mar 19 09:24:42.794066 master-0 kubenswrapper[15202]: I0319 09:24:42.794088 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33e92e5d-61ea-45b2-b357-ebffdaebf4af" volumeName="kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794103 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89b0e82c-1cd1-45aa-9cab-2d11320a1ff7" volumeName="kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-catalog-content" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794127 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794144 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/projected/b2bff8a5-c45d-4d28-8771-2239ad0fa578-kube-api-access-s2ntw" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794163 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cb70a30-a8d1-4037-81e6-eb4f0510a234" volumeName="kubernetes.io/projected/0cb70a30-a8d1-4037-81e6-eb4f0510a234-kube-api-access-q7x89" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794176 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="310d604b-fe9a-4b19-b8b5-7a1983e45e67" volumeName="kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794190 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" volumeName="kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794206 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce38ec35-8f00-4060-a620-1759a6bbef66" volumeName="kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794220 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc9945ac-4041-4120-b504-a173c2bf91bd" volumeName="kubernetes.io/configmap/dc9945ac-4041-4120-b504-a173c2bf91bd-service-ca" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794237 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ac42112-6a00-4c17-b230-75b565aa668f" volumeName="kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794254 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794269 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-encryption-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794289 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-policies" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794303 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f0d16aa2-494d-4a65-880d-3d87219178b5" volumeName="kubernetes.io/empty-dir/f0d16aa2-494d-4a65-880d-3d87219178b5-tmpfs" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794319 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33e92e5d-61ea-45b2-b357-ebffdaebf4af" volumeName="kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794333 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc" volumeName="kubernetes.io/projected/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc-kube-api-access-2zz2n" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794349 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a823c8bc-09ef-46a9-a1f3-155a34b89788" volumeName="kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794366 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794382 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-client" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794396 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" volumeName="kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794415 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dea35f60-33be-4ccc-b985-952eac3a85c0" volumeName="kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794427 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794444 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f93b8728-4a33-4ee4-b7c6-cff7d7995953" volumeName="kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794459 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fedd4b33-c90e-42d5-bc29-73d1701bb671" volumeName="kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794475 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d017ee-b94e-402f-90c1-ccb3f336b2a8" volumeName="kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794589 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794604 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" volumeName="kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794619 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d32541c9-eef6-417c-9f5a-a7392dc70aa0" volumeName="kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794633 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db42b38e-294e-4016-8ac1-54126ac60de8" volumeName="kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-ca-certs" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794656 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01d017ee-b94e-402f-90c1-ccb3f336b2a8" volumeName="kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794676 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-image-import-ca" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794691 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd42096c-f18d-4bb5-8a51-8761dc1edb73" volumeName="kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794708 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dea35f60-33be-4ccc-b985-952eac3a85c0" volumeName="kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794725 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794740 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-client" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794765 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-encryption-config" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794787 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a16f6f-437c-4da5-a797-287e5e1ddbd4" volumeName="kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794803 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86c4b0e4-3481-465d-b00f-022d2c58c183" volumeName="kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794818 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89be0036-a2c8-48b4-9eaf-17fab972c4f4" volumeName="kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794831 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ca444a4-4d78-456f-9656-0c28076ce77e" volumeName="kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794855 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides" seLinuxMountContext=""
Mar 19 09:24:42.794841 master-0 kubenswrapper[15202]: I0319 09:24:42.794890 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8" seLinuxMountContext=""
Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.794902 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="083882c0-ea2f-4405-8cf1-cce5b91fe602" volumeName="kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.794928 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert" seLinuxMountContext=""
Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.794941 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" volumeName="kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls" seLinuxMountContext=""
Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.794957 15202 reconstruct.go:130] "Volume is
marked as uncertain and added into the actual state" pod="" podName="cd42096c-f18d-4bb5-8a51-8761dc1edb73" volumeName="kubernetes.io/projected/cd42096c-f18d-4bb5-8a51-8761dc1edb73-kube-api-access-dxdb6" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.794973 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9ebcecb-c210-434e-83a1-825265e206f1" volumeName="kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.794994 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="872e5f8c-b014-4283-a4d2-0e2cfd29e192" volumeName="kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795073 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="872e5f8c-b014-4283-a4d2-0e2cfd29e192" volumeName="kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795134 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795152 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a417fe25-4aca-471c-941d-c195b6141042" volumeName="kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795861 15202 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a75049de-dcf1-4102-b339-f45d5015adea" volumeName="kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795942 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1dd59466-0133-41fe-a648-28db73aa861b" volumeName="kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795961 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795983 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" volumeName="kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.795999 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a823c8bc-09ef-46a9-a1f3-155a34b89788" volumeName="kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.796012 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8f39c16-3a94-45c3-a51c-f2e81eff967d" volumeName="kubernetes.io/projected/b8f39c16-3a94-45c3-a51c-f2e81eff967d-kube-api-access-qmdlx" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.796025 
15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bec90db1-02e3-4211-8c33-f8bcc304e3a7" volumeName="kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.796046 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.796061 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a75049de-dcf1-4102-b339-f45d5015adea" volumeName="kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config" seLinuxMountContext="" Mar 19 09:24:42.796111 master-0 kubenswrapper[15202]: I0319 09:24:42.796075 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2bff8a5-c45d-4d28-8771-2239ad0fa578" volumeName="kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796094 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" volumeName="kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796210 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="310d604b-fe9a-4b19-b8b5-7a1983e45e67" volumeName="kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796230 
15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796248 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b36f3b2-caf9-40ad-a3a1-e83796142f54" volumeName="kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796262 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fda0d28-6511-4577-9cd3-58a9c1a64d4e" volumeName="kubernetes.io/projected/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-kube-api-access-rnfsx" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796277 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796292 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8f39c16-3a94-45c3-a51c-f2e81eff967d" volumeName="kubernetes.io/configmap/b8f39c16-3a94-45c3-a51c-f2e81eff967d-config-volume" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796337 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd42096c-f18d-4bb5-8a51-8761dc1edb73" volumeName="kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796366 15202 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce38ec35-8f00-4060-a620-1759a6bbef66" volumeName="kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796390 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796411 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39bf78ac-304b-4b82-8729-d184657ef3bb" volumeName="kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-utilities" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796428 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ad3ef11-90df-40b1-acbf-ed9b0c708ddb" volumeName="kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796443 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3376275-294d-446d-9b4c-930df60dba01" volumeName="kubernetes.io/projected/e3376275-294d-446d-9b4c-930df60dba01-kube-api-access-cgsm7" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796460 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fedd4b33-c90e-42d5-bc29-73d1701bb671" volumeName="kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796493 
15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8fdab32-4e61-4e9c-a506-52121f625669" volumeName="kubernetes.io/secret/f8fdab32-4e61-4e9c-a506-52121f625669-webhook-certs" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796631 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f2148fe-f9f6-47da-894c-b88dae360ebe" volumeName="kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796654 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ca444a4-4d78-456f-9656-0c28076ce77e" volumeName="kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796670 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b2898746-6827-41d9-ac88-64206cb84ac9" volumeName="kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert" seLinuxMountContext="" Mar 19 09:24:42.796691 master-0 kubenswrapper[15202]: I0319 09:24:42.796692 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39bf78ac-304b-4b82-8729-d184657ef3bb" volumeName="kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-catalog-content" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.796716 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.796738 15202 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ac42112-6a00-4c17-b230-75b565aa668f" volumeName="kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: E0319 09:24:42.796748 15202 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.796760 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33e92e5d-61ea-45b2-b357-ebffdaebf4af" volumeName="kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.796831 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b36f3b2-caf9-40ad-a3a1-e83796142f54" volumeName="kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.796914 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797021 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823" volumeName="kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797055 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="9ac42112-6a00-4c17-b230-75b565aa668f" volumeName="kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797073 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cb70a30-a8d1-4037-81e6-eb4f0510a234" volumeName="kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-service-ca-bundle" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797089 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f2148fe-f9f6-47da-894c-b88dae360ebe" volumeName="kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797106 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="208939f5-8fca-4fd5-b0c6-43484b7d1e30" volumeName="kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797170 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f93b8728-4a33-4ee4-b7c6-cff7d7995953" volumeName="kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797185 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc9945ac-4041-4120-b504-a173c2bf91bd" volumeName="kubernetes.io/projected/dc9945ac-4041-4120-b504-a173c2bf91bd-kube-api-access" seLinuxMountContext="" Mar 19 09:24:42.797237 master-0 kubenswrapper[15202]: I0319 09:24:42.797209 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="f0d16aa2-494d-4a65-880d-3d87219178b5" volumeName="kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797225 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8fdab32-4e61-4e9c-a506-52121f625669" volumeName="kubernetes.io/projected/f8fdab32-4e61-4e9c-a506-52121f625669-kube-api-access-5xl5z" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797338 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d32541c9-eef6-417c-9f5a-a7392dc70aa0" volumeName="kubernetes.io/projected/d32541c9-eef6-417c-9f5a-a7392dc70aa0-kube-api-access-fvp9m" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797354 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e09725c2-45c6-4a60-b817-6e5316d6f8e8" volumeName="kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797490 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" volumeName="kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797510 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797523 15202 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="96902651-8e2b-44c2-be80-0a8c7c28cb58" volumeName="kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797537 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" volumeName="kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797551 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" volumeName="kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797566 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823" volumeName="kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797583 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797599 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a823c8bc-09ef-46a9-a1f3-155a34b89788" volumeName="kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797619 15202 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="357980ba-1957-412f-afb5-04281eca2bee" volumeName="kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797638 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-serving-ca" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797657 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a4fd337-c385-4f56-965c-d68ee0a4e848" volumeName="kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-config" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797673 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bec90db1-02e3-4211-8c33-f8bcc304e3a7" volumeName="kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797688 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce38ec35-8f00-4060-a620-1759a6bbef66" volumeName="kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797705 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d486ce23-acf7-429a-9739-4770e1a2bf78" volumeName="kubernetes.io/secret/d486ce23-acf7-429a-9739-4770e1a2bf78-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797722 15202 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="6a8e2194-aba6-4929-a29c-47c63c8ff799" volumeName="kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca" seLinuxMountContext="" Mar 19 09:24:42.797726 master-0 kubenswrapper[15202]: I0319 09:24:42.797737 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9663cc40-a69d-42ba-890e-071cb85062f5" volumeName="kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797756 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd69fc33-59d4-4538-b4ec-e2d08ac11f72" volumeName="kubernetes.io/projected/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-kube-api-access-txp58" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797773 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f93b8728-4a33-4ee4-b7c6-cff7d7995953" volumeName="kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797790 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="310d604b-fe9a-4b19-b8b5-7a1983e45e67" volumeName="kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797809 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="467c2f01-2c23-41e2-acb9-08a84061fefc" volumeName="kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797835 15202 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="467c2f01-2c23-41e2-acb9-08a84061fefc" volumeName="kubernetes.io/projected/467c2f01-2c23-41e2-acb9-08a84061fefc-kube-api-access-mxtcq" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797853 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a16f6f-437c-4da5-a797-287e5e1ddbd4" volumeName="kubernetes.io/projected/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-kube-api-access-ws5kr" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797870 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd42096c-f18d-4bb5-8a51-8761dc1edb73" volumeName="kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797887 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fedd4b33-c90e-42d5-bc29-73d1701bb671" volumeName="kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797902 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fda0d28-6511-4577-9cd3-58a9c1a64d4e" volumeName="kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-tuned" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797920 15202 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="89b0e82c-1cd1-45aa-9cab-2d11320a1ff7" volumeName="kubernetes.io/projected/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-kube-api-access-n49x9" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797938 15202 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9ca444a4-4d78-456f-9656-0c28076ce77e" volumeName="kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config" seLinuxMountContext="" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797952 15202 reconstruct.go:97] "Volume reconstruction finished" Mar 19 09:24:42.798285 master-0 kubenswrapper[15202]: I0319 09:24:42.797963 15202 reconciler.go:26] "Reconciler: start to sync state" Mar 19 09:24:42.807552 master-0 kubenswrapper[15202]: I0319 09:24:42.807424 15202 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 09:24:42.810734 master-0 kubenswrapper[15202]: I0319 09:24:42.810656 15202 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 19 09:24:42.810872 master-0 kubenswrapper[15202]: I0319 09:24:42.810742 15202 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 09:24:42.810872 master-0 kubenswrapper[15202]: I0319 09:24:42.810771 15202 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 09:24:42.810949 master-0 kubenswrapper[15202]: E0319 09:24:42.810882 15202 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 09:24:42.812584 master-0 kubenswrapper[15202]: W0319 09:24:42.812527 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 09:24:42.812649 master-0 kubenswrapper[15202]: E0319 09:24:42.812608 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:24:42.820603 master-0 kubenswrapper[15202]: I0319 09:24:42.820562 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-xhzf9_083882c0-ea2f-4405-8cf1-cce5b91fe602/openshift-controller-manager-operator/0.log" Mar 19 09:24:42.820729 master-0 kubenswrapper[15202]: I0319 09:24:42.820609 15202 generic.go:334] "Generic (PLEG): container finished" podID="083882c0-ea2f-4405-8cf1-cce5b91fe602" containerID="787b47766f4f361558a231cbdd8f60cfc309ddb2f5ce9e60ddd25ab14ca4bf8c" exitCode=1 Mar 19 09:24:42.834587 master-0 kubenswrapper[15202]: I0319 09:24:42.834532 15202 generic.go:334] "Generic (PLEG): container finished" podID="a823c8bc-09ef-46a9-a1f3-155a34b89788" containerID="fe703627bf17490741c98c350c37ad5f26868d707caaf28e298dbcd09ba6eb50" exitCode=0 Mar 19 09:24:42.836453 master-0 kubenswrapper[15202]: I0319 09:24:42.836417 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_2de53594-9dcc-4318-806a-64f39ef76b3b/installer/0.log" Mar 19 09:24:42.836453 master-0 kubenswrapper[15202]: I0319 09:24:42.836449 15202 generic.go:334] "Generic (PLEG): container finished" podID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerID="921b32f57f187453279e5e8112c07cdf7b75d2182a8ace33d227749c1f7857e9" exitCode=1 Mar 19 09:24:42.839725 master-0 kubenswrapper[15202]: I0319 09:24:42.839683 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kqb2h_b2898746-6827-41d9-ac88-64206cb84ac9/approver/0.log" Mar 19 09:24:42.840415 master-0 kubenswrapper[15202]: I0319 09:24:42.840367 15202 generic.go:334] "Generic (PLEG): container finished" podID="b2898746-6827-41d9-ac88-64206cb84ac9" 
containerID="5f66b7b4498be8ffcef1be07d5415ae49ca99cf0c15b74518d97c2537613d5cc" exitCode=1 Mar 19 09:24:42.847045 master-0 kubenswrapper[15202]: I0319 09:24:42.846995 15202 generic.go:334] "Generic (PLEG): container finished" podID="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" containerID="a481a6ff530440a1264d2535843bd9da5aad52194298733f7093828af5a8bb83" exitCode=0 Mar 19 09:24:42.847045 master-0 kubenswrapper[15202]: I0319 09:24:42.847036 15202 generic.go:334] "Generic (PLEG): container finished" podID="7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8" containerID="82689d1e71e4b8853162fd6caae5b840062273cb60c91d420d169ba6d7d40278" exitCode=0 Mar 19 09:24:42.851708 master-0 kubenswrapper[15202]: I0319 09:24:42.851237 15202 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d" exitCode=1 Mar 19 09:24:42.855807 master-0 kubenswrapper[15202]: I0319 09:24:42.855748 15202 generic.go:334] "Generic (PLEG): container finished" podID="b2bff8a5-c45d-4d28-8771-2239ad0fa578" containerID="a48adae5f84d07444bee0a5da7af010f18bdba5c7270d3b00d241369bd585daa" exitCode=0 Mar 19 09:24:42.858144 master-0 kubenswrapper[15202]: I0319 09:24:42.858098 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/0.log" Mar 19 09:24:42.858262 master-0 kubenswrapper[15202]: I0319 09:24:42.858146 15202 generic.go:334] "Generic (PLEG): container finished" podID="6a8e2194-aba6-4929-a29c-47c63c8ff799" containerID="d43b2cecb46ee4d7282d2377662b9eb7bab83399567784e4db2c8496f2616648" exitCode=1 Mar 19 09:24:42.860927 master-0 kubenswrapper[15202]: I0319 09:24:42.860766 15202 generic.go:334] "Generic (PLEG): container finished" podID="5b36f3b2-caf9-40ad-a3a1-e83796142f54" containerID="a9e3c64428edfb89f548d2d0f11b93a4546a142c8d9ea26eed5c6670f21e1d16" exitCode=0 Mar 19 09:24:42.870875 master-0 
kubenswrapper[15202]: I0319 09:24:42.869993 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_dc248e59-1519-4ac3-9005-2239214a8d62/installer/0.log" Mar 19 09:24:42.870875 master-0 kubenswrapper[15202]: I0319 09:24:42.870077 15202 generic.go:334] "Generic (PLEG): container finished" podID="dc248e59-1519-4ac3-9005-2239214a8d62" containerID="2b23049d85d383fc87e2217ac4c88730e6accf178c37b42720c1211cad94765e" exitCode=1 Mar 19 09:24:42.873456 master-0 kubenswrapper[15202]: I0319 09:24:42.873398 15202 generic.go:334] "Generic (PLEG): container finished" podID="33e92e5d-61ea-45b2-b357-ebffdaebf4af" containerID="e567b2a6970dbbdd6d360830a8ee46fec46945b28639df21bdc4828de4e3065b" exitCode=0 Mar 19 09:24:42.880175 master-0 kubenswrapper[15202]: I0319 09:24:42.880138 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-pn5gg_db42b38e-294e-4016-8ac1-54126ac60de8/manager/0.log" Mar 19 09:24:42.880667 master-0 kubenswrapper[15202]: E0319 09:24:42.880635 15202 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:24:42.880743 master-0 kubenswrapper[15202]: I0319 09:24:42.880193 15202 generic.go:334] "Generic (PLEG): container finished" podID="db42b38e-294e-4016-8ac1-54126ac60de8" containerID="35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50" exitCode=1 Mar 19 09:24:42.883039 master-0 kubenswrapper[15202]: I0319 09:24:42.883009 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svct_872e5f8c-b014-4283-a4d2-0e2cfd29e192/kube-multus/0.log" Mar 19 09:24:42.883089 master-0 kubenswrapper[15202]: I0319 09:24:42.883043 15202 generic.go:334] "Generic (PLEG): container finished" podID="872e5f8c-b014-4283-a4d2-0e2cfd29e192" containerID="b504737085975340ca235cec0c4c9e74b2eb5d8b9a50455476ac176eb78b4a5c" exitCode=1 Mar 19 
09:24:42.891396 master-0 kubenswrapper[15202]: I0319 09:24:42.891363 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 09:24:42.891819 master-0 kubenswrapper[15202]: I0319 09:24:42.891766 15202 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11" exitCode=1 Mar 19 09:24:42.891819 master-0 kubenswrapper[15202]: I0319 09:24:42.891810 15202 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="c218293403aa861a38085870e890bceedfe5394df8d5e259c54d305af3fdeae9" exitCode=0 Mar 19 09:24:42.893271 master-0 kubenswrapper[15202]: I0319 09:24:42.893236 15202 generic.go:334] "Generic (PLEG): container finished" podID="dd69fc33-59d4-4538-b4ec-e2d08ac11f72" containerID="8fa2aedcd94c8a914c06f3267aec5df548ae61bbfade5d0ba8f849928a4839e1" exitCode=0 Mar 19 09:24:42.896053 master-0 kubenswrapper[15202]: I0319 09:24:42.896019 15202 generic.go:334] "Generic (PLEG): container finished" podID="89b0e82c-1cd1-45aa-9cab-2d11320a1ff7" containerID="cf86a9f840243b51077e44de7146e420b0ec2bfabf64c651e8c74a472013cdb5" exitCode=0 Mar 19 09:24:42.905795 master-0 kubenswrapper[15202]: I0319 09:24:42.905730 15202 generic.go:334] "Generic (PLEG): container finished" podID="39bf78ac-304b-4b82-8729-d184657ef3bb" containerID="833893fe28da658713401cd9bbaf4ee6973b0a664d7435b398ca89d99977b122" exitCode=0 Mar 19 09:24:42.911062 master-0 kubenswrapper[15202]: E0319 09:24:42.911007 15202 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 09:24:42.911138 master-0 kubenswrapper[15202]: I0319 09:24:42.911026 15202 generic.go:334] "Generic (PLEG): container finished" podID="9663cc40-a69d-42ba-890e-071cb85062f5" 
containerID="cdf18d2610050197f807cf4a5fc0308ba6a5aa77b434d76558194e6bb3ba81d0" exitCode=0 Mar 19 09:24:42.915813 master-0 kubenswrapper[15202]: I0319 09:24:42.915748 15202 generic.go:334] "Generic (PLEG): container finished" podID="89be0036-a2c8-48b4-9eaf-17fab972c4f4" containerID="5d59e82ae91c2ed1c8a992bffe58e7eea15792d208f2b71cb72f5ee7bff4f994" exitCode=0 Mar 19 09:24:42.923600 master-0 kubenswrapper[15202]: I0319 09:24:42.923535 15202 generic.go:334] "Generic (PLEG): container finished" podID="357980ba-1957-412f-afb5-04281eca2bee" containerID="fdd9285acae300c3c00a66ae69c66c3dae68ae6703f408d0bdc875283085bf0e" exitCode=0 Mar 19 09:24:42.944406 master-0 kubenswrapper[15202]: I0319 09:24:42.944170 15202 generic.go:334] "Generic (PLEG): container finished" podID="8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823" containerID="4903db04251051a54ad7e347003826304ccc0327af5e8e5393199af2a3df5cfe" exitCode=0 Mar 19 09:24:42.947173 master-0 kubenswrapper[15202]: I0319 09:24:42.947124 15202 generic.go:334] "Generic (PLEG): container finished" podID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerID="9b28c300e3439abe307f50e88ba8ce2d925b14966bafd61f93ba6a56066cd1f7" exitCode=0 Mar 19 09:24:42.951013 master-0 kubenswrapper[15202]: I0319 09:24:42.950976 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9" exitCode=0 Mar 19 09:24:42.951013 master-0 kubenswrapper[15202]: I0319 09:24:42.951009 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f" exitCode=0 Mar 19 09:24:42.951154 master-0 kubenswrapper[15202]: I0319 09:24:42.951019 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391" exitCode=0 Mar 19 09:24:42.957549 master-0 
kubenswrapper[15202]: I0319 09:24:42.957436 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-96qpx_86c4b0e4-3481-465d-b00f-022d2c58c183/openshift-apiserver-operator/1.log" Mar 19 09:24:42.957549 master-0 kubenswrapper[15202]: I0319 09:24:42.957525 15202 generic.go:334] "Generic (PLEG): container finished" podID="86c4b0e4-3481-465d-b00f-022d2c58c183" containerID="d8a756b9b58a3ce072eadde280ccd4f57de1077de86a738e2697b1425743281c" exitCode=255 Mar 19 09:24:42.961653 master-0 kubenswrapper[15202]: I0319 09:24:42.961599 15202 generic.go:334] "Generic (PLEG): container finished" podID="3a4fd337-c385-4f56-965c-d68ee0a4e848" containerID="87f01015e01422976c49ff53ddbd24b82fcd8498b6ca2d45f75d0b8a77fa808e" exitCode=0 Mar 19 09:24:42.970562 master-0 kubenswrapper[15202]: I0319 09:24:42.970499 15202 generic.go:334] "Generic (PLEG): container finished" podID="f1943401-a75b-4e45-8c65-3cc36018d8c4" containerID="ac22f5d33f89532c3f245e5d78a3e4b4931118bf3ea5e137f52cf13514162a71" exitCode=0 Mar 19 09:24:42.976189 master-0 kubenswrapper[15202]: I0319 09:24:42.976133 15202 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8" exitCode=2 Mar 19 09:24:42.977949 master-0 kubenswrapper[15202]: I0319 09:24:42.977910 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_0df23b55-3dea-4f5e-9d53-5c7755ea4e48/installer/0.log" Mar 19 09:24:42.978078 master-0 kubenswrapper[15202]: I0319 09:24:42.977957 15202 generic.go:334] "Generic (PLEG): container finished" podID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerID="cbac5fecef5ccbfed911c8dc762330e4e21b1d157632cde1feee52ece3850c21" exitCode=1 Mar 19 09:24:42.979834 master-0 kubenswrapper[15202]: I0319 09:24:42.979806 15202 generic.go:334] "Generic (PLEG): container finished" 
podID="310d604b-fe9a-4b19-b8b5-7a1983e45e67" containerID="f349a28ea0bb985b97d809f46b60d5c4412444c67eeb0389e91efb0430bb6dcb" exitCode=0 Mar 19 09:24:42.980931 master-0 kubenswrapper[15202]: E0319 09:24:42.980895 15202 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:24:42.982357 master-0 kubenswrapper[15202]: I0319 09:24:42.982322 15202 generic.go:334] "Generic (PLEG): container finished" podID="62d3ca81-26e1-4625-a3aa-b1eabd31cfd6" containerID="be05318150c766720e5d230c0bf2401720113751ff91aa74d2d72ed4d56c5f47" exitCode=0 Mar 19 09:24:42.982799 master-0 kubenswrapper[15202]: E0319 09:24:42.982761 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:24:42.986146 master-0 kubenswrapper[15202]: I0319 09:24:42.986082 15202 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="157ec68d28f9ad49e7460cf4325702e32a61a87e98a342a6b3f00e830966c9b0" exitCode=0 Mar 19 09:24:42.992361 master-0 kubenswrapper[15202]: I0319 09:24:42.992328 15202 generic.go:334] "Generic (PLEG): container finished" podID="96902651-8e2b-44c2-be80-0a8c7c28cb58" containerID="df60facd7b253794e244b5462531d7a854ab92c89e6e7a5b56683d4b99824cfc" exitCode=0 Mar 19 09:24:42.998905 master-0 kubenswrapper[15202]: I0319 09:24:42.998860 15202 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="dee05648403cf8d6ee35acba18e21f4c87a759e5c8fc08c0570622f3df3f33e1" exitCode=0 Mar 19 09:24:42.998905 master-0 kubenswrapper[15202]: I0319 09:24:42.998898 15202 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" 
containerID="a5f501670eb3ea46a2e9833a8efe0358489fe82196edec8a883f420d084aeb16" exitCode=0 Mar 19 09:24:42.998905 master-0 kubenswrapper[15202]: I0319 09:24:42.998906 15202 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="b1fd1a1a09332960aaf03f0be319bfd31ad0e612d2387b20f773844856dcefe5" exitCode=0 Mar 19 09:24:42.998905 master-0 kubenswrapper[15202]: I0319 09:24:42.998914 15202 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="665177f0301e1fc60d7ae832223fecb7c16c65e7cc5cfa86a5c6a63e7efdc407" exitCode=0 Mar 19 09:24:42.998905 master-0 kubenswrapper[15202]: I0319 09:24:42.998921 15202 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="7335f4e870393336ecca59a320d7b43e9c33ca895a7a0816d7e753f6c020f7af" exitCode=0 Mar 19 09:24:42.999317 master-0 kubenswrapper[15202]: I0319 09:24:42.998948 15202 generic.go:334] "Generic (PLEG): container finished" podID="e9ebcecb-c210-434e-83a1-825265e206f1" containerID="a2f44163a580069fe9b4a06584e3e0baeea817a9f7b28d2b1b8dc2d50f42ba8a" exitCode=0 Mar 19 09:24:43.003460 master-0 kubenswrapper[15202]: I0319 09:24:43.003403 15202 generic.go:334] "Generic (PLEG): container finished" podID="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" containerID="b4cd172092883e2c59c413605caa9eda30c5b4011ddd9168033acc5dfa87297f" exitCode=0 Mar 19 09:24:43.003460 master-0 kubenswrapper[15202]: I0319 09:24:43.003445 15202 generic.go:334] "Generic (PLEG): container finished" podID="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" containerID="0147b737bd4c355c56866c4e60a80701e47045be188939cc9ec3ede186a99781" exitCode=0 Mar 19 09:24:43.003460 master-0 kubenswrapper[15202]: I0319 09:24:43.003454 15202 generic.go:334] "Generic (PLEG): container finished" podID="f216606b-43d0-43d0-a3e3-a3ee2952e7b8" containerID="3efe12fe8fe63c4780aeba64aa817d31d700162f5f08cf1695416899a639c633" exitCode=0 Mar 19 
09:24:43.005060 master-0 kubenswrapper[15202]: I0319 09:24:43.005014 15202 generic.go:334] "Generic (PLEG): container finished" podID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerID="89a2fc8df576416ddd348c57ed4c730f0abfa16882e2a3cc4358c65c4a9606ca" exitCode=0 Mar 19 09:24:43.008897 master-0 kubenswrapper[15202]: I0319 09:24:43.008860 15202 generic.go:334] "Generic (PLEG): container finished" podID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerID="ddf97e1b992b687ae1658f8b5cc4c1c01ae45509b7aaa2768e80614c358636c9" exitCode=0 Mar 19 09:24:43.014523 master-0 kubenswrapper[15202]: I0319 09:24:43.013873 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw_a75049de-dcf1-4102-b339-f45d5015adea/kube-storage-version-migrator-operator/1.log" Mar 19 09:24:43.014523 master-0 kubenswrapper[15202]: I0319 09:24:43.013943 15202 generic.go:334] "Generic (PLEG): container finished" podID="a75049de-dcf1-4102-b339-f45d5015adea" containerID="42b9a79d42542a10355bd1a462df5ffb67f1a10eae7fe6919eb834123087d197" exitCode=255 Mar 19 09:24:43.081006 master-0 kubenswrapper[15202]: E0319 09:24:43.080968 15202 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:24:43.107335 master-0 kubenswrapper[15202]: E0319 09:24:43.107172 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e33d2945d16b3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:24:42.755831475 +0000 UTC 
m=+0.141246291,LastTimestamp:2026-03-19 09:24:42.755831475 +0000 UTC m=+0.141246291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:24:43.111363 master-0 kubenswrapper[15202]: E0319 09:24:43.111314 15202 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 09:24:43.147833 master-0 kubenswrapper[15202]: I0319 09:24:43.147779 15202 manager.go:324] Recovery completed Mar 19 09:24:43.181152 master-0 kubenswrapper[15202]: E0319 09:24:43.181060 15202 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 09:24:43.219534 master-0 kubenswrapper[15202]: I0319 09:24:43.219398 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.221560 master-0 kubenswrapper[15202]: I0319 09:24:43.221539 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.221616 master-0 kubenswrapper[15202]: I0319 09:24:43.221600 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.221616 master-0 kubenswrapper[15202]: I0319 09:24:43.221611 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.224351 master-0 kubenswrapper[15202]: I0319 09:24:43.224317 15202 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 09:24:43.224351 master-0 kubenswrapper[15202]: I0319 09:24:43.224338 15202 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 09:24:43.224473 master-0 kubenswrapper[15202]: I0319 09:24:43.224358 15202 state_mem.go:36] "Initialized new in-memory state store" Mar 19 09:24:43.224565 master-0 kubenswrapper[15202]: I0319 09:24:43.224535 15202 state_mem.go:88] 
"Updated default CPUSet" cpuSet="" Mar 19 09:24:43.224608 master-0 kubenswrapper[15202]: I0319 09:24:43.224555 15202 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 19 09:24:43.224608 master-0 kubenswrapper[15202]: I0319 09:24:43.224579 15202 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 19 09:24:43.224608 master-0 kubenswrapper[15202]: I0319 09:24:43.224588 15202 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 19 09:24:43.224608 master-0 kubenswrapper[15202]: I0319 09:24:43.224596 15202 policy_none.go:49] "None policy: Start" Mar 19 09:24:43.227451 master-0 kubenswrapper[15202]: I0319 09:24:43.227406 15202 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 09:24:43.227451 master-0 kubenswrapper[15202]: I0319 09:24:43.227450 15202 state_mem.go:35] "Initializing new in-memory state store" Mar 19 09:24:43.227693 master-0 kubenswrapper[15202]: I0319 09:24:43.227665 15202 state_mem.go:75] "Updated machine memory state" Mar 19 09:24:43.227693 master-0 kubenswrapper[15202]: I0319 09:24:43.227682 15202 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 19 09:24:43.237933 master-0 kubenswrapper[15202]: I0319 09:24:43.237868 15202 manager.go:334] "Starting Device Plugin manager" Mar 19 09:24:43.237933 master-0 kubenswrapper[15202]: I0319 09:24:43.237910 15202 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 09:24:43.237933 master-0 kubenswrapper[15202]: I0319 09:24:43.237923 15202 server.go:79] "Starting device plugin registration server" Mar 19 09:24:43.238580 master-0 kubenswrapper[15202]: I0319 09:24:43.238422 15202 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 09:24:43.238580 master-0 kubenswrapper[15202]: I0319 09:24:43.238436 15202 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Mar 19 09:24:43.238820 master-0 kubenswrapper[15202]: I0319 09:24:43.238680 15202 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 09:24:43.238820 master-0 kubenswrapper[15202]: I0319 09:24:43.238771 15202 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 09:24:43.238820 master-0 kubenswrapper[15202]: I0319 09:24:43.238779 15202 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 09:24:43.258747 master-0 kubenswrapper[15202]: E0319 09:24:43.258630 15202 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:24:43.339285 master-0 kubenswrapper[15202]: I0319 09:24:43.339182 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.341794 master-0 kubenswrapper[15202]: I0319 09:24:43.341737 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.341794 master-0 kubenswrapper[15202]: I0319 09:24:43.341769 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.341794 master-0 kubenswrapper[15202]: I0319 09:24:43.341777 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.341794 master-0 kubenswrapper[15202]: I0319 09:24:43.341798 15202 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:24:43.342826 master-0 kubenswrapper[15202]: E0319 09:24:43.342749 15202 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 09:24:43.384413 master-0 kubenswrapper[15202]: E0319 09:24:43.384324 15202 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:24:43.512657 master-0 kubenswrapper[15202]: I0319 09:24:43.512377 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 09:24:43.512657 master-0 kubenswrapper[15202]: I0319 09:24:43.512525 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.515221 master-0 kubenswrapper[15202]: I0319 09:24:43.515173 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.515221 master-0 kubenswrapper[15202]: I0319 09:24:43.515225 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.515384 master-0 kubenswrapper[15202]: I0319 09:24:43.515235 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.515423 master-0 kubenswrapper[15202]: I0319 09:24:43.515402 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.515602 master-0 kubenswrapper[15202]: I0319 09:24:43.515560 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.518594 master-0 kubenswrapper[15202]: I0319 09:24:43.518557 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Mar 19 09:24:43.518594 master-0 kubenswrapper[15202]: I0319 09:24:43.518595 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.518785 master-0 kubenswrapper[15202]: I0319 09:24:43.518571 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.518785 master-0 kubenswrapper[15202]: I0319 09:24:43.518628 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.518785 master-0 kubenswrapper[15202]: I0319 09:24:43.518638 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.518785 master-0 kubenswrapper[15202]: I0319 09:24:43.518609 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.518935 master-0 kubenswrapper[15202]: I0319 09:24:43.518880 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.518995 master-0 kubenswrapper[15202]: I0319 09:24:43.518965 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.522628 master-0 kubenswrapper[15202]: I0319 09:24:43.521875 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.522628 master-0 kubenswrapper[15202]: I0319 09:24:43.522622 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.522628 master-0 kubenswrapper[15202]: I0319 09:24:43.522731 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.524253 master-0 kubenswrapper[15202]: I0319 09:24:43.524206 15202 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.524473 master-0 kubenswrapper[15202]: I0319 09:24:43.524432 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.524473 master-0 kubenswrapper[15202]: I0319 09:24:43.524458 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.525837 master-0 kubenswrapper[15202]: I0319 09:24:43.525593 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.526022 master-0 kubenswrapper[15202]: I0319 09:24:43.525981 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.528262 master-0 kubenswrapper[15202]: I0319 09:24:43.528225 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.528262 master-0 kubenswrapper[15202]: I0319 09:24:43.528260 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.528371 master-0 kubenswrapper[15202]: I0319 09:24:43.528274 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.530284 master-0 kubenswrapper[15202]: I0319 09:24:43.530217 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.530284 master-0 kubenswrapper[15202]: I0319 09:24:43.530285 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.530395 master-0 kubenswrapper[15202]: I0319 09:24:43.530302 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 
19 09:24:43.530473 master-0 kubenswrapper[15202]: I0319 09:24:43.530441 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.530673 master-0 kubenswrapper[15202]: I0319 09:24:43.530630 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.533188 master-0 kubenswrapper[15202]: I0319 09:24:43.533152 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.533342 master-0 kubenswrapper[15202]: I0319 09:24:43.533295 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.533423 master-0 kubenswrapper[15202]: I0319 09:24:43.533351 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.533659 master-0 kubenswrapper[15202]: I0319 09:24:43.533628 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.533984 master-0 kubenswrapper[15202]: I0319 09:24:43.533947 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.534035 master-0 kubenswrapper[15202]: I0319 09:24:43.534004 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.534098 master-0 kubenswrapper[15202]: I0319 09:24:43.534075 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.534135 master-0 kubenswrapper[15202]: I0319 09:24:43.534099 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.537541 master-0 kubenswrapper[15202]: I0319 09:24:43.537460 15202 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.537541 master-0 kubenswrapper[15202]: I0319 09:24:43.537521 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.537541 master-0 kubenswrapper[15202]: I0319 09:24:43.537547 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.538956 master-0 kubenswrapper[15202]: I0319 09:24:43.538913 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.538956 master-0 kubenswrapper[15202]: I0319 09:24:43.538952 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.539103 master-0 kubenswrapper[15202]: I0319 09:24:43.538963 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.539157 master-0 kubenswrapper[15202]: I0319 09:24:43.539134 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846174bbc21aaf0dbb6863b67ef55a4060d549089aa7226a91ee6bec43a301c1" Mar 19 09:24:43.539237 master-0 kubenswrapper[15202]: I0319 09:24:43.539170 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"6081e5f52de3fc4dc3f746460dde01bf5beff21d46d2be6b213ee24cc51a7282"} Mar 19 09:24:43.539237 master-0 kubenswrapper[15202]: I0319 09:24:43.539233 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d"} Mar 19 09:24:43.539308 master-0 kubenswrapper[15202]: I0319 09:24:43.539248 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"71c394faadffb1d1d025aba30e8b78502ffdbdb82f02d4937b0a94dcc10adf15"} Mar 19 09:24:43.539308 master-0 kubenswrapper[15202]: I0319 09:24:43.539285 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="430cc5e4e962eae018339493051d7d67829497881a372c3f753b7b26f53dfd82" Mar 19 09:24:43.539377 master-0 kubenswrapper[15202]: I0319 09:24:43.539318 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6f21e047d7fe1c17012e8b0e2eccf0c0df41f1dd7af47ee16ae785f35047af4" Mar 19 09:24:43.539377 master-0 kubenswrapper[15202]: I0319 09:24:43.539332 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"e046e1ab5ed34b841248a951c60543dfca2a668c2cdbbcdc17996eec0b9a0bfb"} Mar 19 09:24:43.539377 master-0 kubenswrapper[15202]: I0319 09:24:43.539342 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"c49a6a019bc3b37ac531ff227e2d3d6370ee40933900fe53c00def76b6a2ea11"} Mar 19 09:24:43.539377 master-0 kubenswrapper[15202]: I0319 09:24:43.539345 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.539536 master-0 kubenswrapper[15202]: I0319 09:24:43.539353 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"c218293403aa861a38085870e890bceedfe5394df8d5e259c54d305af3fdeae9"} Mar 19 09:24:43.539536 master-0 kubenswrapper[15202]: I0319 
09:24:43.539507 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"4b5ac55eaeb240fc51eb94a3c1fa8bc29a0ef164ccb6e67fd2c9653989350931"} Mar 19 09:24:43.539613 master-0 kubenswrapper[15202]: I0319 09:24:43.539543 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"4801b7b4c9bb4aca19f4e1af1002ed5d","Type":"ContainerStarted","Data":"ca6f9d86f4547c04b74275fdc5a8bdf830a7730f563977f4e36b33b777ca5d0d"} Mar 19 09:24:43.539613 master-0 kubenswrapper[15202]: I0319 09:24:43.539598 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="180c49c03a35395c4e92171521a2f80d367dc88d5d80b54b142ce2e921c63f26" Mar 19 09:24:43.539613 master-0 kubenswrapper[15202]: I0319 09:24:43.539607 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539616 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539634 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539642 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539651 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539659 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539672 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539707 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539718 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"319f3eae52d20a2ec527a014335891e8e573a6a3f8a960e1f80f21e3f46c5210"} Mar 19 09:24:43.539723 master-0 kubenswrapper[15202]: I0319 09:24:43.539728 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"44e894b303bf6b07415200fe11b3cc2f55e9c844a1695d0cc00770ec72ab5afb"} Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539766 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"8c6bf6e4dc06dc33ce2a60a0abd7d0a106b6973ee1336f65f910e0cb73c9c346"} Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539778 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8"} Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539788 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a7909254e1fd575ef7a679770eb6617922c50b1fbb682ef07075bcdacdc5e021"} Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539797 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede"} Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539809 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6cfce77c340f4bb4a16a10098e49742c092d4ba5982fa86ee07da43d113194" Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539831 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68155acc818fe44730d87075246aa6bb7a8626c9d9ae55e511e6d1b689d90334" Mar 19 09:24:43.540021 
master-0 kubenswrapper[15202]: I0319 09:24:43.539874 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c5cd2c130a06e83a755f581cc3a20c2c3dce618468e51c158559ad4071da8b" Mar 19 09:24:43.540021 master-0 kubenswrapper[15202]: I0319 09:24:43.539887 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9589bbab032e262b4d7aedeb656ab180a0c26f2d3e71118ea25c48ac0d07f6bd" Mar 19 09:24:43.541386 master-0 kubenswrapper[15202]: I0319 09:24:43.541347 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.541386 master-0 kubenswrapper[15202]: I0319 09:24:43.541383 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.541483 master-0 kubenswrapper[15202]: I0319 09:24:43.541395 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.543084 master-0 kubenswrapper[15202]: I0319 09:24:43.543046 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:43.544960 master-0 kubenswrapper[15202]: I0319 09:24:43.544926 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:43.544960 master-0 kubenswrapper[15202]: I0319 09:24:43.544953 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:43.545093 master-0 kubenswrapper[15202]: I0319 09:24:43.544964 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:43.545093 master-0 kubenswrapper[15202]: I0319 09:24:43.544982 15202 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 09:24:43.545702 master-0 kubenswrapper[15202]: E0319 09:24:43.545669 
15202 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 09:24:43.608331 master-0 kubenswrapper[15202]: I0319 09:24:43.608248 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.608331 master-0 kubenswrapper[15202]: I0319 09:24:43.608318 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608351 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608377 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608409 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608435 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608455 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608508 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608531 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608550 
15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:24:43.608603 master-0 kubenswrapper[15202]: I0319 09:24:43.608572 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.608898 master-0 kubenswrapper[15202]: I0319 09:24:43.608649 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.608898 master-0 kubenswrapper[15202]: I0319 09:24:43.608690 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:24:43.608898 master-0 kubenswrapper[15202]: I0319 09:24:43.608720 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " 
pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.608898 master-0 kubenswrapper[15202]: I0319 09:24:43.608749 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.609015 master-0 kubenswrapper[15202]: I0319 09:24:43.608868 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.609015 master-0 kubenswrapper[15202]: I0319 09:24:43.608931 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.609015 master-0 kubenswrapper[15202]: I0319 09:24:43.608955 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.609015 master-0 kubenswrapper[15202]: I0319 09:24:43.608982 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.609015 master-0 kubenswrapper[15202]: I0319 09:24:43.609003 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:24:43.609338 master-0 kubenswrapper[15202]: I0319 09:24:43.609027 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.609338 master-0 kubenswrapper[15202]: I0319 09:24:43.609071 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.609338 master-0 kubenswrapper[15202]: I0319 09:24:43.609100 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.710865 master-0 kubenswrapper[15202]: I0319 09:24:43.710797 15202 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.710879 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.710938 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.710968 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.710990 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711014 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711051 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711070 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711085 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711054 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711112 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") 
pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711115 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711146 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711166 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711180 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711198 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod 
\"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711220 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711253 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711249 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711279 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711307 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: 
\"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711316 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711336 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711339 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711287 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711342 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711383 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711396 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711418 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711422 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711368 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711455 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711504 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711505 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711556 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711449 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711583 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711614 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711632 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711662 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711669 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711692 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711698 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711670 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711714 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:43.711977 master-0 kubenswrapper[15202]: I0319 09:24:43.711692 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:43.758792 master-0 kubenswrapper[15202]: I0319 09:24:43.758711 15202 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:43.790657 master-0 kubenswrapper[15202]: W0319 09:24:43.790496 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:43.790657 master-0 kubenswrapper[15202]: E0319 09:24:43.790598 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:24:43.871289 master-0 kubenswrapper[15202]: W0319 09:24:43.871188 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:43.871354 master-0 kubenswrapper[15202]: E0319 09:24:43.871279 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:24:43.946304 master-0 kubenswrapper[15202]: I0319 09:24:43.946251 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:43.948728 master-0 kubenswrapper[15202]: I0319 09:24:43.948683 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:43.948728 master-0 kubenswrapper[15202]: I0319 09:24:43.948726 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:43.948839 master-0 kubenswrapper[15202]: I0319 09:24:43.948736 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:43.948839 master-0 kubenswrapper[15202]: I0319 09:24:43.948764 15202 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:24:43.949614 master-0 kubenswrapper[15202]: E0319 09:24:43.949575 15202 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:24:44.023929 master-0 kubenswrapper[15202]: I0319 09:24:44.023882 15202 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="6081e5f52de3fc4dc3f746460dde01bf5beff21d46d2be6b213ee24cc51a7282" exitCode=1
Mar 19 09:24:44.024042 master-0 kubenswrapper[15202]: I0319 09:24:44.023993 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:44.024876 master-0 kubenswrapper[15202]: I0319 09:24:44.024592 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"6081e5f52de3fc4dc3f746460dde01bf5beff21d46d2be6b213ee24cc51a7282"}
Mar 19 09:24:44.024876 master-0 kubenswrapper[15202]: I0319 09:24:44.024669 15202 scope.go:117] "RemoveContainer" containerID="fb358362b8cb28eafac3f9aba109f76ce567ce8d3db682847c789685409b9e4d"
Mar 19 09:24:44.024876 master-0 kubenswrapper[15202]: I0319 09:24:44.024701 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:44.024994 master-0 kubenswrapper[15202]: I0319 09:24:44.024858 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:44.024994 master-0 kubenswrapper[15202]: I0319 09:24:44.024870 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:44.026284 master-0 kubenswrapper[15202]: I0319 09:24:44.025965 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:44.026284 master-0 kubenswrapper[15202]: I0319 09:24:44.025987 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:44.026284 master-0 kubenswrapper[15202]: I0319 09:24:44.025996 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:44.028169 master-0 kubenswrapper[15202]: I0319 09:24:44.028133 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:44.028169 master-0 kubenswrapper[15202]: I0319 09:24:44.028156 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:44.028169 master-0 kubenswrapper[15202]: I0319 09:24:44.028165 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:44.028493 master-0 kubenswrapper[15202]: I0319 09:24:44.028403 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:44.028493 master-0 kubenswrapper[15202]: I0319 09:24:44.028447 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:44.028493 master-0 kubenswrapper[15202]: I0319 09:24:44.028465 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:44.028990 master-0 kubenswrapper[15202]: I0319 09:24:44.028955 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:44.028990 master-0 kubenswrapper[15202]: I0319 09:24:44.028990 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:44.029086 master-0 kubenswrapper[15202]: I0319 09:24:44.029003 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:44.029373 master-0 kubenswrapper[15202]: I0319 09:24:44.029339 15202 scope.go:117] "RemoveContainer" containerID="6081e5f52de3fc4dc3f746460dde01bf5beff21d46d2be6b213ee24cc51a7282"
Mar 19 09:24:44.169754 master-0 kubenswrapper[15202]: W0319 09:24:44.169666 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:44.170028 master-0 kubenswrapper[15202]: E0319 09:24:44.169977 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:24:44.185571 master-0 kubenswrapper[15202]: E0319 09:24:44.185504 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 09:24:44.251603 master-0 kubenswrapper[15202]: W0319 09:24:44.249511 15202 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:44.251603 master-0 kubenswrapper[15202]: E0319 09:24:44.249634 15202 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:24:44.750762 master-0 kubenswrapper[15202]: I0319 09:24:44.750667 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:44.754816 master-0 kubenswrapper[15202]: I0319 09:24:44.754761 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:44.754816 master-0 kubenswrapper[15202]: I0319 09:24:44.754812 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:44.754960 master-0 kubenswrapper[15202]: I0319 09:24:44.754824 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:44.754960 master-0 kubenswrapper[15202]: I0319 09:24:44.754856 15202 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:24:44.755547 master-0 kubenswrapper[15202]: E0319 09:24:44.755506 15202 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 19 09:24:44.758413 master-0 kubenswrapper[15202]: I0319 09:24:44.758375 15202 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 19 09:24:45.032038 master-0 kubenswrapper[15202]: I0319 09:24:45.031882 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce" exitCode=0
Mar 19 09:24:45.032038 master-0 kubenswrapper[15202]: I0319 09:24:45.031927 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerDied","Data":"b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce"}
Mar 19 09:24:45.032038 master-0 kubenswrapper[15202]: I0319 09:24:45.031911 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:45.034326 master-0 kubenswrapper[15202]: I0319 09:24:45.034287 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:45.034533 master-0 kubenswrapper[15202]: I0319 09:24:45.034511 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:45.034687 master-0 kubenswrapper[15202]: I0319 09:24:45.034663 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:45.035149 master-0 kubenswrapper[15202]: I0319 09:24:45.035094 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:45.035292 master-0 kubenswrapper[15202]: I0319 09:24:45.035137 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"92a2db24929eebeb86c10e4da2210d08ce4c067d7696a9c259054e240344e6fa"}
Mar 19 09:24:45.037314 master-0 kubenswrapper[15202]: I0319 09:24:45.036946 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:45.037314 master-0 kubenswrapper[15202]: I0319 09:24:45.036965 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:45.037314 master-0 kubenswrapper[15202]: I0319 09:24:45.036973 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:45.037314 master-0 kubenswrapper[15202]: I0319 09:24:45.037171 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:45.038456 master-0 kubenswrapper[15202]: I0319 09:24:45.038428 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"4801b7b4c9bb4aca19f4e1af1002ed5d","Type":"ContainerStarted","Data":"528b303c1aa0e5650b031fceefeae9a2856d906d524b7139df21f2091e40d442"}
Mar 19 09:24:45.038750 master-0 kubenswrapper[15202]: I0319 09:24:45.038728 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:45.038887 master-0 kubenswrapper[15202]: I0319 09:24:45.038853 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:45.038887 master-0 kubenswrapper[15202]: I0319 09:24:45.038885 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:45.038995 master-0 kubenswrapper[15202]: I0319 09:24:45.038894 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:45.041698 master-0 kubenswrapper[15202]: I0319 09:24:45.041676 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:45.041859 master-0 kubenswrapper[15202]: I0319 09:24:45.041846 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:45.041988 master-0 kubenswrapper[15202]: I0319 09:24:45.041972 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:46.080747 master-0 kubenswrapper[15202]: I0319 09:24:46.080672 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a"}
Mar 19 09:24:46.080747 master-0 kubenswrapper[15202]: I0319 09:24:46.080749 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788"}
Mar 19 09:24:46.081273 master-0 kubenswrapper[15202]: I0319 09:24:46.080707 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:46.081273 master-0 kubenswrapper[15202]: I0319 09:24:46.080808 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:46.085701 master-0 kubenswrapper[15202]: I0319 09:24:46.085677 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:46.085797 master-0 kubenswrapper[15202]: I0319 09:24:46.085713 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:46.085797 master-0 kubenswrapper[15202]: I0319 09:24:46.085723 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:46.085797 master-0 kubenswrapper[15202]: I0319 09:24:46.085678 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:46.085908 master-0 kubenswrapper[15202]: I0319 09:24:46.085836 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:46.085976 master-0 kubenswrapper[15202]: I0319 09:24:46.085916 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:46.358617 master-0 kubenswrapper[15202]: I0319 09:24:46.357964 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:46.361820 master-0 kubenswrapper[15202]: I0319 09:24:46.361765 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:46.361820 master-0 kubenswrapper[15202]: I0319 09:24:46.361819 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:46.361947 master-0 kubenswrapper[15202]: I0319 09:24:46.361839 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:46.361947 master-0 kubenswrapper[15202]: I0319 09:24:46.361875 15202 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 09:24:46.472584 master-0 kubenswrapper[15202]: I0319 09:24:46.472395 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:46.472914 master-0 kubenswrapper[15202]: I0319 09:24:46.472674 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:46.475911 master-0 kubenswrapper[15202]: I0319 09:24:46.475857 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:46.476038 master-0 kubenswrapper[15202]: I0319 09:24:46.475946 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:46.476038 master-0 kubenswrapper[15202]: I0319 09:24:46.475965 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:47.090601 master-0 kubenswrapper[15202]: I0319 09:24:47.090531 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"21a9ca68aca58418f611d967784b8b2e15b3acfa4bde8394a7537d1e53b9f6af"}
Mar 19 09:24:47.090601 master-0 kubenswrapper[15202]: I0319 09:24:47.090593 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486"}
Mar 19 09:24:47.090601 master-0 kubenswrapper[15202]: I0319 09:24:47.090608 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84"}
Mar 19 09:24:48.095846 master-0 kubenswrapper[15202]: I0319 09:24:48.095793 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:48.097944 master-0 kubenswrapper[15202]: I0319 09:24:48.097898 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:48.098024 master-0 kubenswrapper[15202]: I0319 09:24:48.097954 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:48.098024 master-0 kubenswrapper[15202]: I0319 09:24:48.097967 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:48.803674 master-0 kubenswrapper[15202]: I0319 09:24:48.803613 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 19 09:24:48.803889 master-0 kubenswrapper[15202]: I0319 09:24:48.803840 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:48.809781 master-0 kubenswrapper[15202]: I0319 09:24:48.809730 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:48.809781 master-0 kubenswrapper[15202]: I0319 09:24:48.809778 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:48.809781 master-0 kubenswrapper[15202]: I0319 09:24:48.809790 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:50.107442 master-0 kubenswrapper[15202]: I0319 09:24:50.107342 15202 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="8c6bf6e4dc06dc33ce2a60a0abd7d0a106b6973ee1336f65f910e0cb73c9c346" exitCode=1
Mar 19 09:24:50.107442 master-0 kubenswrapper[15202]: I0319 09:24:50.107411 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"8c6bf6e4dc06dc33ce2a60a0abd7d0a106b6973ee1336f65f910e0cb73c9c346"}
Mar 19 09:24:50.108178 master-0 kubenswrapper[15202]: I0319 09:24:50.107520 15202 scope.go:117] "RemoveContainer" containerID="705708ba128bb3bfbebedfc2ce68d8ee8e42b244c59d6b7831204ffa0bd15bc8"
Mar 19 09:24:50.108178 master-0 kubenswrapper[15202]: I0319 09:24:50.107782 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:50.109831 master-0 kubenswrapper[15202]: I0319 09:24:50.109784 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:50.109914 master-0 kubenswrapper[15202]: I0319 09:24:50.109838 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:50.109914 master-0 kubenswrapper[15202]: I0319 09:24:50.109851 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:50.110303 master-0 kubenswrapper[15202]: I0319 09:24:50.110272 15202 scope.go:117] "RemoveContainer" containerID="8c6bf6e4dc06dc33ce2a60a0abd7d0a106b6973ee1336f65f910e0cb73c9c346"
Mar 19 09:24:50.416502 master-0 kubenswrapper[15202]: I0319 09:24:50.416357 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:50.443593 master-0 kubenswrapper[15202]: I0319 09:24:50.443532 15202 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:50.872249 master-0 kubenswrapper[15202]: I0319 09:24:50.872149 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 19 09:24:50.872571 master-0 kubenswrapper[15202]: I0319 09:24:50.872356 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:50.874962 master-0 kubenswrapper[15202]: I0319 09:24:50.874894 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:50.875115 master-0 kubenswrapper[15202]: I0319 09:24:50.874972 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:50.875115 master-0 kubenswrapper[15202]: I0319 09:24:50.874984 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:51.872598 master-0 kubenswrapper[15202]: I0319 09:24:51.872513 15202 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Startup probe status=failure output="Get \"https://192.168.32.10:9980/readyz\": context deadline exceeded" start-of-body=
Mar 19 09:24:51.873118 master-0 kubenswrapper[15202]: I0319 09:24:51.872648 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" probeResult="failure" output="Get \"https://192.168.32.10:9980/readyz\": context deadline exceeded"
Mar 19 09:24:52.119367 master-0 kubenswrapper[15202]: I0319 09:24:52.119268 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411"}
Mar 19 09:24:52.252538 master-0 kubenswrapper[15202]: I0319 09:24:52.252458 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:24:52.252738 master-0 kubenswrapper[15202]: I0319 09:24:52.252655 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:52.254884 master-0 kubenswrapper[15202]: I0319 09:24:52.254865 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:52.254959 master-0 kubenswrapper[15202]: I0319 09:24:52.254891 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:52.254959 master-0 kubenswrapper[15202]: I0319 09:24:52.254901 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:52.270653 master-0 kubenswrapper[15202]: I0319 09:24:52.270585 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:24:52.273781 master-0 kubenswrapper[15202]: I0319 09:24:52.273723 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:52.273869 master-0 kubenswrapper[15202]: I0319 09:24:52.273787 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:24:53.073232 master-0 kubenswrapper[15202]: I0319 09:24:53.073150 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: I0319 09:24:53.078290 15202 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]log ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [-]etcd failed: reason withheld
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-filter ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiextensions-informers ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiextensions-controllers ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/crd-informer-synced ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-system-namespaces-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/bootstrap-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-registration-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-discovery-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]autoregister-completion ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-openapi-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 19 09:24:53.078355 master-0 kubenswrapper[15202]: livez check failed
Mar 19 09:24:53.079705 master-0 kubenswrapper[15202]: I0319 09:24:53.078365 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 09:24:53.128554 master-0 kubenswrapper[15202]: I0319 09:24:53.128495 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:53.128875 master-0 kubenswrapper[15202]: I0319 09:24:53.128495 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:24:53.132036 master-0 kubenswrapper[15202]: I0319 09:24:53.132000 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:53.132036 master-0 kubenswrapper[15202]: I0319 09:24:53.132029 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:53.132036 master-0 kubenswrapper[15202]: I0319 09:24:53.132039 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:53.132579 master-0 kubenswrapper[15202]: I0319 09:24:53.132465 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 09:24:53.132579 master-0 kubenswrapper[15202]: I0319 09:24:53.132593 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 09:24:53.132695 master-0 kubenswrapper[15202]: I0319 09:24:53.132606 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 09:24:53.226505 master-0 kubenswrapper[15202]: I0319 09:24:53.226401 15202 kubelet.go:2542]
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:53.231659 master-0 kubenswrapper[15202]: I0319 09:24:53.231599 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:53.258836 master-0 kubenswrapper[15202]: E0319 09:24:53.258789 15202 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 09:24:54.133732 master-0 kubenswrapper[15202]: I0319 09:24:54.133676 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:54.134209 master-0 kubenswrapper[15202]: I0319 09:24:54.133676 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:54.137341 master-0 kubenswrapper[15202]: I0319 09:24:54.137293 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:54.137341 master-0 kubenswrapper[15202]: I0319 09:24:54.137344 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:54.137501 master-0 kubenswrapper[15202]: I0319 09:24:54.137354 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:54.137899 master-0 kubenswrapper[15202]: I0319 09:24:54.137869 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:54.137949 master-0 kubenswrapper[15202]: I0319 09:24:54.137924 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:54.137949 master-0 kubenswrapper[15202]: I0319 09:24:54.137934 15202 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:54.138634 master-0 kubenswrapper[15202]: I0319 09:24:54.138600 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:24:55.139246 master-0 kubenswrapper[15202]: I0319 09:24:55.139176 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:55.141268 master-0 kubenswrapper[15202]: I0319 09:24:55.141197 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:55.141268 master-0 kubenswrapper[15202]: I0319 09:24:55.141250 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:55.141268 master-0 kubenswrapper[15202]: I0319 09:24:55.141276 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:56.143538 master-0 kubenswrapper[15202]: I0319 09:24:56.143481 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:24:56.145506 master-0 kubenswrapper[15202]: I0319 09:24:56.145457 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:24:56.145599 master-0 kubenswrapper[15202]: I0319 09:24:56.145518 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:24:56.145599 master-0 kubenswrapper[15202]: I0319 09:24:56.145531 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:24:56.957968 master-0 kubenswrapper[15202]: E0319 09:24:56.957911 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="3.2s" Mar 19 09:24:57.134079 master-0 kubenswrapper[15202]: I0319 09:24:57.133978 15202 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" start-of-body= Mar 19 09:24:57.134079 master-0 kubenswrapper[15202]: I0319 09:24:57.134070 15202 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: I0319 09:24:58.865988 15202 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]log ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]etcd ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 09:24:58.866082 master-0 
kubenswrapper[15202]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-filter ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiextensions-informers ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiextensions-controllers ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/crd-informer-synced ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-system-namespaces-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/bootstrap-controller ok Mar 19 09:24:58.866082 master-0 
kubenswrapper[15202]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-aggregator-informers ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-registration-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-discovery-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]autoregister-completion ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-openapi-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 19 09:24:58.866082 master-0 kubenswrapper[15202]: livez check failed Mar 19 09:24:58.869086 master-0 kubenswrapper[15202]: I0319 09:24:58.866100 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:25:00.416842 master-0 kubenswrapper[15202]: I0319 09:25:00.416779 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:25:00.417276 master-0 kubenswrapper[15202]: I0319 09:25:00.416960 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:25:00.419281 master-0 kubenswrapper[15202]: I0319 
09:25:00.419240 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:25:00.419281 master-0 kubenswrapper[15202]: I0319 09:25:00.419280 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:25:00.419418 master-0 kubenswrapper[15202]: I0319 09:25:00.419291 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:25:00.886332 master-0 kubenswrapper[15202]: I0319 09:25:00.886240 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 09:25:00.886606 master-0 kubenswrapper[15202]: I0319 09:25:00.886437 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:25:00.888516 master-0 kubenswrapper[15202]: I0319 09:25:00.888491 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 09:25:00.888648 master-0 kubenswrapper[15202]: I0319 09:25:00.888527 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:25:00.888648 master-0 kubenswrapper[15202]: I0319 09:25:00.888539 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:25:00.898898 master-0 kubenswrapper[15202]: I0319 09:25:00.898838 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 09:25:01.171299 master-0 kubenswrapper[15202]: I0319 09:25:01.171175 15202 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:25:01.173943 master-0 kubenswrapper[15202]: I0319 09:25:01.173893 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Mar 19 09:25:01.173943 master-0 kubenswrapper[15202]: I0319 09:25:01.173944 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 09:25:01.174080 master-0 kubenswrapper[15202]: I0319 09:25:01.173958 15202 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 09:25:01.961088 master-0 kubenswrapper[15202]: I0319 09:25:01.961033 15202 trace.go:236] Trace[379598221]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:24:46.255) (total time: 15705ms): Mar 19 09:25:01.961088 master-0 kubenswrapper[15202]: Trace[379598221]: ---"Objects listed" error: 15705ms (09:25:01.960) Mar 19 09:25:01.961088 master-0 kubenswrapper[15202]: Trace[379598221]: [15.705975587s] [15.705975587s] END Mar 19 09:25:01.961088 master-0 kubenswrapper[15202]: I0319 09:25:01.961063 15202 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:25:01.961818 master-0 kubenswrapper[15202]: I0319 09:25:01.961182 15202 trace.go:236] Trace[1799248378]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:24:46.490) (total time: 15470ms): Mar 19 09:25:01.961818 master-0 kubenswrapper[15202]: Trace[1799248378]: ---"Objects listed" error: 15470ms (09:25:01.961) Mar 19 09:25:01.961818 master-0 kubenswrapper[15202]: Trace[1799248378]: [15.470094919s] [15.470094919s] END Mar 19 09:25:01.961818 master-0 kubenswrapper[15202]: I0319 09:25:01.961226 15202 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:25:01.962418 master-0 kubenswrapper[15202]: I0319 09:25:01.962381 15202 trace.go:236] Trace[1342008557]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:24:45.476) (total time: 16486ms): Mar 19 09:25:01.962418 master-0 
kubenswrapper[15202]: Trace[1342008557]: ---"Objects listed" error: 16485ms (09:25:01.962) Mar 19 09:25:01.962418 master-0 kubenswrapper[15202]: Trace[1342008557]: [16.486080258s] [16.486080258s] END Mar 19 09:25:01.962418 master-0 kubenswrapper[15202]: I0319 09:25:01.962406 15202 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:25:02.254238 master-0 kubenswrapper[15202]: I0319 09:25:02.254019 15202 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" start-of-body= Mar 19 09:25:02.254238 master-0 kubenswrapper[15202]: I0319 09:25:02.254173 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" Mar 19 09:25:02.257587 master-0 kubenswrapper[15202]: I0319 09:25:02.257500 15202 trace.go:236] Trace[1023430868]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:24:47.248) (total time: 15008ms): Mar 19 09:25:02.257587 master-0 kubenswrapper[15202]: Trace[1023430868]: ---"Objects listed" error: 15008ms (09:25:02.257) Mar 19 09:25:02.257587 master-0 kubenswrapper[15202]: Trace[1023430868]: [15.008693623s] [15.008693623s] END Mar 19 09:25:02.257587 master-0 kubenswrapper[15202]: I0319 09:25:02.257547 15202 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:25:02.273966 master-0 kubenswrapper[15202]: I0319 09:25:02.273864 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:25:02.766620 master-0 kubenswrapper[15202]: I0319 09:25:02.766144 15202 apiserver.go:52] "Watching apiserver" Mar 19 09:25:02.790034 master-0 kubenswrapper[15202]: I0319 09:25:02.789952 15202 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:25:02.791144 master-0 kubenswrapper[15202]: I0319 09:25:02.791093 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0","openshift-dns/node-resolver-pmxm8","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc","openshift-kube-apiserver/installer-1-master-0","openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5","assisted-installer/assisted-installer-controller-gn85g","openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz","openshift-apiserver/apiserver-54cd8888b9-q4ztg","openshift-multus/multus-8svct","openshift-network-diagnostics/network-check-target-95w9b","openshift-network-node-identity/network-node-identity-kqb2h","openshift-machine-config-operator/machine-config-daemon-hgc52","openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm","openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869","kube-system/bootstrap-kube-scheduler-master-0","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb","openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2","openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws","openshift-controller-manager/controller-manager-6f9655dc5d-8lp25","openshift-kube-apiserver/installer-2-master-0","openshift
-machine-api/cluster-baremetal-operator-6f69995874-nm9nx","openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv","openshift-cluster-version/cluster-version-operator-7d58488df-thkn2","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j","openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp","openshift-etcd/etcd-master-0","openshift-insights/insights-operator-68bf6ff9d6-wshz8","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-multus/multus-additional-cni-plugins-tjzdb","openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd","openshift-service-ca/service-ca-79bc6b8d76-xlhg9","openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb","openshift-marketplace/certified-operators-tkx45","openshift-marketplace/redhat-marketplace-wzz6n","openshift-network-operator/iptables-alerter-2s58d","openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn","openshift-cluster-node-tuning-operator/tuned-vkw4s","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw","openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk","openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv","openshift-ovn-kubernetes/ovnkube-node-fwjzr","openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5","openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5","openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq","openshift-dns/dns-default-p88qq","openshift-kube-apiserver/kube-apiserver-sta
rtup-monitor-master-0","openshift-multus/network-metrics-daemon-p76jz","openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5","openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp","openshift-marketplace/redhat-operators-zpvpd","openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2","openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498","openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2","openshift-dns-operator/dns-operator-9c5679d8f-fdxtp","openshift-etcd/installer-1-master-0","openshift-kube-controller-manager/installer-2-master-0","openshift-kube-scheduler/installer-4-master-0","openshift-marketplace/community-operators-wqngb","openshift-marketplace/marketplace-operator-89ccd998f-6qck2","openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk","openshift-network-operator/network-operator-7bd846bfc4-jxvxl","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"] Mar 19 09:25:02.791338 master-0 kubenswrapper[15202]: I0319 09:25:02.791312 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-gn85g" Mar 19 09:25:02.805356 master-0 kubenswrapper[15202]: I0319 09:25:02.802152 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:25:02.805356 master-0 kubenswrapper[15202]: I0319 09:25:02.804432 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:25:02.805356 master-0 kubenswrapper[15202]: I0319 09:25:02.805193 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:25:02.806520 master-0 kubenswrapper[15202]: I0319 09:25:02.806418 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:25:02.807441 master-0 kubenswrapper[15202]: I0319 09:25:02.807077 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:25:02.807441 master-0 kubenswrapper[15202]: I0319 09:25:02.807164 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:25:02.808934 master-0 kubenswrapper[15202]: I0319 09:25:02.808902 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:25:02.810561 master-0 kubenswrapper[15202]: I0319 09:25:02.810435 15202 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="7f0ee125-e760-4bd1-a88b-8e71716de6b8" Mar 19 09:25:02.811850 master-0 kubenswrapper[15202]: I0319 09:25:02.811788 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:25:02.812100 master-0 
kubenswrapper[15202]: I0319 09:25:02.812022 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:25:02.812242 master-0 kubenswrapper[15202]: I0319 09:25:02.812207 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:25:02.812508 master-0 kubenswrapper[15202]: I0319 09:25:02.812486 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:25:02.812572 master-0 kubenswrapper[15202]: I0319 09:25:02.812535 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:25:02.813229 master-0 kubenswrapper[15202]: I0319 09:25:02.812690 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:25:02.813229 master-0 kubenswrapper[15202]: I0319 09:25:02.812761 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:25:02.813768 master-0 kubenswrapper[15202]: I0319 09:25:02.813729 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.813932 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814019 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814154 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814257 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814421 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814449 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814521 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814581 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814618 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814653 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814673 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814677 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:25:02.815435 master-0 
kubenswrapper[15202]: I0319 09:25:02.814591 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814755 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814522 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814791 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814822 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814867 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814940 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814961 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814940 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.815013 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.815029 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.815058 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814684 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814869 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.815101 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.814962 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.815239 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 09:25:02.815435 master-0 kubenswrapper[15202]: I0319 09:25:02.815304 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:25:02.816452 master-0 kubenswrapper[15202]: I0319 09:25:02.815827 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 19 09:25:02.816452 master-0 kubenswrapper[15202]: I0319 09:25:02.815867 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 19 09:25:02.816452 master-0 kubenswrapper[15202]: I0319 09:25:02.815931 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 09:25:02.816452 master-0 kubenswrapper[15202]: I0319 09:25:02.816208 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 09:25:02.816452 master-0 kubenswrapper[15202]: I0319 09:25:02.816384 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 09:25:02.816688 master-0 kubenswrapper[15202]: I0319 09:25:02.816539 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 09:25:02.816688 master-0 kubenswrapper[15202]: I0319 09:25:02.816617 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.816940 master-0 kubenswrapper[15202]: I0319 09:25:02.816912 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 09:25:02.821727 master-0 kubenswrapper[15202]: I0319 09:25:02.820912 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 09:25:02.821727 master-0 kubenswrapper[15202]: I0319 09:25:02.821026 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.821727 master-0 kubenswrapper[15202]: I0319 09:25:02.821089 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 09:25:02.822507 master-0 kubenswrapper[15202]: I0319 09:25:02.822457 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 09:25:02.822633 master-0 kubenswrapper[15202]: I0319 09:25:02.822612 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.822762 master-0 kubenswrapper[15202]: I0319 09:25:02.822728 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.823108 master-0 kubenswrapper[15202]: I0319 09:25:02.823088 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.823282 master-0 kubenswrapper[15202]: I0319 09:25:02.823268 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.823438 master-0 kubenswrapper[15202]: I0319 09:25:02.823425 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 09:25:02.827059 master-0 kubenswrapper[15202]: I0319 09:25:02.827008 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 09:25:02.828103 master-0 kubenswrapper[15202]: I0319 09:25:02.828077 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.828526 master-0 kubenswrapper[15202]: I0319 09:25:02.828405 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 09:25:02.828617 master-0 kubenswrapper[15202]: I0319 09:25:02.828597 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 09:25:02.830867 master-0 kubenswrapper[15202]: I0319 09:25:02.830835 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5"
Mar 19 09:25:02.834197 master-0 kubenswrapper[15202]: I0319 09:25:02.834143 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:25:02.834648 master-0 kubenswrapper[15202]: I0319 09:25:02.834630 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:25:02.834766 master-0 kubenswrapper[15202]: I0319 09:25:02.834739 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:25:02.834849 master-0 kubenswrapper[15202]: I0319 09:25:02.834641 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:25:02.835005 master-0 kubenswrapper[15202]: I0319 09:25:02.834669 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 09:25:02.835216 master-0 kubenswrapper[15202]: I0319 09:25:02.834679 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 09:25:02.835390 master-0 kubenswrapper[15202]: I0319 09:25:02.835042 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 09:25:02.836817 master-0 kubenswrapper[15202]: I0319 09:25:02.835637 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.836901 master-0 kubenswrapper[15202]: I0319 09:25:02.835460 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 09:25:02.837074 master-0 kubenswrapper[15202]: I0319 09:25:02.835814 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.837529 master-0 kubenswrapper[15202]: I0319 09:25:02.835821 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 09:25:02.837529 master-0 kubenswrapper[15202]: I0319 09:25:02.835859 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 09:25:02.837778 master-0 kubenswrapper[15202]: I0319 09:25:02.835928 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 09:25:02.837921 master-0 kubenswrapper[15202]: I0319 09:25:02.837897 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 09:25:02.837956 master-0 kubenswrapper[15202]: I0319 09:25:02.835963 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.837988 master-0 kubenswrapper[15202]: I0319 09:25:02.835984 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:25:02.838145 master-0 kubenswrapper[15202]: I0319 09:25:02.836000 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 09:25:02.838190 master-0 kubenswrapper[15202]: I0319 09:25:02.836063 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.838287 master-0 kubenswrapper[15202]: I0319 09:25:02.836077 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 09:25:02.838421 master-0 kubenswrapper[15202]: I0319 09:25:02.836128 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:25:02.838559 master-0 kubenswrapper[15202]: I0319 09:25:02.836211 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 09:25:02.838639 master-0 kubenswrapper[15202]: I0319 09:25:02.838607 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:25:02.838789 master-0 kubenswrapper[15202]: I0319 09:25:02.838772 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 09:25:02.839048 master-0 kubenswrapper[15202]: I0319 09:25:02.839007 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:25:02.839756 master-0 kubenswrapper[15202]: I0319 09:25:02.839712 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 09:25:02.841934 master-0 kubenswrapper[15202]: I0319 09:25:02.841896 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 09:25:02.842407 master-0 kubenswrapper[15202]: I0319 09:25:02.842329 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 09:25:02.842614 master-0 kubenswrapper[15202]: I0319 09:25:02.842571 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 09:25:02.842877 master-0 kubenswrapper[15202]: I0319 09:25:02.842826 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 09:25:02.844123 master-0 kubenswrapper[15202]: I0319 09:25:02.844098 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:25:02.844327 master-0 kubenswrapper[15202]: I0319 09:25:02.844290 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 09:25:02.844492 master-0 kubenswrapper[15202]: I0319 09:25:02.844427 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 09:25:02.846119 master-0 kubenswrapper[15202]: I0319 09:25:02.846027 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 09:25:02.847670 master-0 kubenswrapper[15202]: I0319 09:25:02.847608 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:25:02.848195 master-0 kubenswrapper[15202]: I0319 09:25:02.848125 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 19 09:25:02.848195 master-0 kubenswrapper[15202]: I0319 09:25:02.848171 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 09:25:02.849030 master-0 kubenswrapper[15202]: I0319 09:25:02.848995 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 09:25:02.856217 master-0 kubenswrapper[15202]: I0319 09:25:02.856169 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:25:02.857339 master-0 kubenswrapper[15202]: I0319 09:25:02.857290 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 09:25:02.858703 master-0 kubenswrapper[15202]: I0319 09:25:02.858671 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 09:25:02.859273 master-0 kubenswrapper[15202]: I0319 09:25:02.859217 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 09:25:02.873971 master-0 kubenswrapper[15202]: I0319 09:25:02.873912 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 19 09:25:02.882755 master-0 kubenswrapper[15202]: I0319 09:25:02.882687 15202 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 09:25:02.899767 master-0 kubenswrapper[15202]: I0319 09:25:02.899701 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 09:25:02.914386 master-0 kubenswrapper[15202]: I0319 09:25:02.914321 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 09:25:02.934671 master-0 kubenswrapper[15202]: I0319 09:25:02.934628 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 09:25:02.954094 master-0 kubenswrapper[15202]: I0319 09:25:02.954046 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 09:25:02.975319 master-0 kubenswrapper[15202]: I0319 09:25:02.975269 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 09:25:02.996544 master-0 kubenswrapper[15202]: I0319 09:25:02.995025 15202 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 19 09:25:02.999322 master-0 kubenswrapper[15202]: I0319 09:25:02.999273 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:25:03.019840 master-0 kubenswrapper[15202]: I0319 09:25:03.019718 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:25:03.033603 master-0 kubenswrapper[15202]: I0319 09:25:03.033543 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 09:25:03.054047 master-0 kubenswrapper[15202]: I0319 09:25:03.054000 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 09:25:03.073689 master-0 kubenswrapper[15202]: I0319 09:25:03.073645 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 09:25:03.094556 master-0 kubenswrapper[15202]: I0319 09:25:03.094508 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 09:25:03.095872 master-0 kubenswrapper[15202]: I0319 09:25:03.095829 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtcq\" (UniqueName: \"kubernetes.io/projected/467c2f01-2c23-41e2-acb9-08a84061fefc-kube-api-access-mxtcq\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:25:03.095998 master-0 kubenswrapper[15202]: I0319 09:25:03.095876 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5cd\" (UniqueName: \"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:25:03.095998 master-0 kubenswrapper[15202]: I0319 09:25:03.095900 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"
Mar 19 09:25:03.095998 master-0 kubenswrapper[15202]: I0319 09:25:03.095917 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-utilities\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:03.095998 master-0 kubenswrapper[15202]: I0319 09:25:03.095939 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:25:03.095998 master-0 kubenswrapper[15202]: I0319 09:25:03.095974 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.096282 master-0 kubenswrapper[15202]: I0319 09:25:03.095999 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:25:03.096282 master-0 kubenswrapper[15202]: I0319 09:25:03.096028 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:25:03.096282 master-0 kubenswrapper[15202]: I0319 09:25:03.096217 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-utilities\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:03.096397 master-0 kubenswrapper[15202]: I0319 09:25:03.096350 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-config\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.096620 master-0 kubenswrapper[15202]: I0319 09:25:03.096497 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:25:03.096714 master-0 kubenswrapper[15202]: I0319 09:25:03.096671 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.096761 master-0 kubenswrapper[15202]: I0319 09:25:03.096587 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-operand-assets\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:25:03.096761 master-0 kubenswrapper[15202]: I0319 09:25:03.096738 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.096886 master-0 kubenswrapper[15202]: I0319 09:25:03.096789 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:25:03.097033 master-0 kubenswrapper[15202]: I0319 09:25:03.096970 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.097090 master-0 kubenswrapper[15202]: I0319 09:25:03.097048 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"
Mar 19 09:25:03.097090 master-0 kubenswrapper[15202]: I0319 09:25:03.097055 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:25:03.097169 master-0 kubenswrapper[15202]: I0319 09:25:03.097093 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:25:03.097169 master-0 kubenswrapper[15202]: I0319 09:25:03.097125 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d90f590a-6118-4769-b18f-fec67dd62c20-signing-cabundle\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:25:03.097169 master-0 kubenswrapper[15202]: I0319 09:25:03.097125 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:25:03.097169 master-0 kubenswrapper[15202]: I0319 09:25:03.097151 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7x89\" (UniqueName: \"kubernetes.io/projected/0cb70a30-a8d1-4037-81e6-eb4f0510a234-kube-api-access-q7x89\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:03.097169 master-0 kubenswrapper[15202]: I0319 09:25:03.097174 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfsx\" (UniqueName: \"kubernetes.io/projected/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-kube-api-access-rnfsx\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.097423 master-0 kubenswrapper[15202]: I0319 09:25:03.097388 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-utilities\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:03.097461 master-0 kubenswrapper[15202]: I0319 09:25:03.097432 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:25:03.097461 master-0 kubenswrapper[15202]: I0319 09:25:03.097432 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz7d6\" (UniqueName: \"kubernetes.io/projected/31742478-0d83-48cf-b38b-02416d95d4a8-kube-api-access-wz7d6\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:25:03.097565 master-0 kubenswrapper[15202]: I0319 09:25:03.097454 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d90f590a-6118-4769-b18f-fec67dd62c20-signing-cabundle\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:25:03.097565 master-0 kubenswrapper[15202]: I0319 09:25:03.097490 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-catalog-content\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:03.097565 master-0 kubenswrapper[15202]: I0319 09:25:03.097515 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:25:03.097565 master-0 kubenswrapper[15202]: I0319 09:25:03.097527 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-utilities\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:03.097565 master-0 kubenswrapper[15202]: I0319 09:25:03.097539 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9dj\" (UniqueName: \"kubernetes.io/projected/3a4fd337-c385-4f56-965c-d68ee0a4e848-kube-api-access-vr9dj\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.097565 master-0 kubenswrapper[15202]: I0319 09:25:03.097560 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:25:03.097752 master-0 kubenswrapper[15202]: I0319 09:25:03.097581 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2ntw\" (UniqueName: \"kubernetes.io/projected/b2bff8a5-c45d-4d28-8771-2239ad0fa578-kube-api-access-s2ntw\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:03.097752 master-0 kubenswrapper[15202]: I0319 09:25:03.097601 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.097752 master-0 kubenswrapper[15202]: I0319 09:25:03.097634 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:25:03.097752 master-0 kubenswrapper[15202]: I0319 09:25:03.097666 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.097907 master-0 kubenswrapper[15202]: I0319 09:25:03.097748 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39bf78ac-304b-4b82-8729-d184657ef3bb-catalog-content\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:03.097907 master-0 kubenswrapper[15202]: I0319 09:25:03.097884 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-config\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:25:03.097907 master-0 kubenswrapper[15202]: I0319 09:25:03.097890 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/310d604b-fe9a-4b19-b8b5-7a1983e45e67-config\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:25:03.098033 master-0 kubenswrapper[15202]: I0319 09:25:03.097897 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.098033 master-0 kubenswrapper[15202]: I0319 09:25:03.098007 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:25:03.098033 master-0 kubenswrapper[15202]: I0319 09:25:03.098013 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.098143 master-0 kubenswrapper[15202]: I0319 09:25:03.098032 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d486ce23-acf7-429a-9739-4770e1a2bf78-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:25:03.098143 master-0 kubenswrapper[15202]: I0319 09:25:03.098060 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.098143 master-0 kubenswrapper[15202]: I0319 09:25:03.098076 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.098143 master-0 kubenswrapper[15202]: I0319 09:25:03.098085 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:25:03.098143 master-0 kubenswrapper[15202]: I0319 09:25:03.098114 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:25:03.098143 master-0 kubenswrapper[15202]: I0319 09:25:03.098142 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljb2\" (UniqueName: \"kubernetes.io/projected/d90f590a-6118-4769-b18f-fec67dd62c20-kube-api-access-nljb2\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098170 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098232 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a823c8bc-09ef-46a9-a1f3-155a34b89788-config\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098238 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098290 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID:
\"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098324 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098353 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p" Mar 19 09:25:03.098460 master-0 kubenswrapper[15202]: I0319 09:25:03.098404 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.098751 master-0 kubenswrapper[15202]: I0319 09:25:03.098489 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-ovnkube-identity-cm\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h" Mar 19 09:25:03.098751 master-0 kubenswrapper[15202]: I0319 09:25:03.098499 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kfw5k\" (UniqueName: \"kubernetes.io/projected/f93b8728-4a33-4ee4-b7c6-cff7d7995953-kube-api-access-kfw5k\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:03.098751 master-0 kubenswrapper[15202]: I0319 09:25:03.098539 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb" Mar 19 09:25:03.098751 master-0 kubenswrapper[15202]: I0319 09:25:03.098608 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:03.098751 master-0 kubenswrapper[15202]: I0319 09:25:03.098665 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.098751 master-0 kubenswrapper[15202]: I0319 09:25:03.098701 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.098978 master-0 kubenswrapper[15202]: I0319 09:25:03.098758 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.098978 master-0 kubenswrapper[15202]: I0319 09:25:03.098797 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:25:03.098978 master-0 kubenswrapper[15202]: I0319 09:25:03.098857 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxfs\" (UniqueName: \"kubernetes.io/projected/f1943401-a75b-4e45-8c65-3cc36018d8c4-kube-api-access-8cxfs\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd" Mar 19 09:25:03.098978 master-0 kubenswrapper[15202]: I0319 09:25:03.098911 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdjs\" (UniqueName: \"kubernetes.io/projected/d486ce23-acf7-429a-9739-4770e1a2bf78-kube-api-access-bzdjs\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn" Mar 19 09:25:03.098978 master-0 kubenswrapper[15202]: I0319 09:25:03.098946 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.099173 master-0 kubenswrapper[15202]: I0319 09:25:03.099000 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.099173 master-0 kubenswrapper[15202]: I0319 09:25:03.099038 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lktk8\" (UniqueName: \"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:25:03.099173 master-0 kubenswrapper[15202]: I0319 09:25:03.099124 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.099173 master-0 kubenswrapper[15202]: I0319 09:25:03.099089 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bec90db1-02e3-4211-8c33-f8bcc304e3a7-iptables-alerter-script\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 
09:25:03.099327 master-0 kubenswrapper[15202]: I0319 09:25:03.099186 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk" Mar 19 09:25:03.099327 master-0 kubenswrapper[15202]: I0319 09:25:03.099221 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm" Mar 19 09:25:03.099327 master-0 kubenswrapper[15202]: I0319 09:25:03.099279 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9" Mar 19 09:25:03.099577 master-0 kubenswrapper[15202]: I0319 09:25:03.099313 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb" Mar 19 09:25:03.099577 master-0 kubenswrapper[15202]: I0319 09:25:03.099373 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.099577 master-0 kubenswrapper[15202]: I0319 09:25:03.099430 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhwd7\" (UniqueName: \"kubernetes.io/projected/141cb120-92da-4d8d-bc29-fc4c433a6336-kube-api-access-fhwd7\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" Mar 19 09:25:03.099577 master-0 kubenswrapper[15202]: I0319 09:25:03.099510 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-client\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.099577 master-0 kubenswrapper[15202]: I0319 09:25:03.099549 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d90f590a-6118-4769-b18f-fec67dd62c20-signing-key\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9" Mar 19 09:25:03.099808 master-0 kubenswrapper[15202]: I0319 09:25:03.099684 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.099808 master-0 
kubenswrapper[15202]: I0319 09:25:03.099744 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.099808 master-0 kubenswrapper[15202]: I0319 09:25:03.099780 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsdjh\" (UniqueName: \"kubernetes.io/projected/f0d16aa2-494d-4a65-880d-3d87219178b5-kube-api-access-fsdjh\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:03.099932 master-0 kubenswrapper[15202]: I0319 09:25:03.099812 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc9945ac-4041-4120-b504-a173c2bf91bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.099932 master-0 kubenswrapper[15202]: I0319 09:25:03.099807 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d90f590a-6118-4769-b18f-fec67dd62c20-signing-key\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9" Mar 19 09:25:03.100027 master-0 kubenswrapper[15202]: I0319 09:25:03.100000 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod 
\"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:25:03.100074 master-0 kubenswrapper[15202]: I0319 09:25:03.100046 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.100131 master-0 kubenswrapper[15202]: I0319 09:25:03.100093 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/1dd59466-0133-41fe-a648-28db73aa861b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.100174 master-0 kubenswrapper[15202]: I0319 09:25:03.100158 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:25:03.100221 master-0 kubenswrapper[15202]: I0319 09:25:03.100184 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rltcj\" (UniqueName: \"kubernetes.io/projected/39bf78ac-304b-4b82-8729-d184657ef3bb-kube-api-access-rltcj\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n" Mar 19 09:25:03.100221 master-0 kubenswrapper[15202]: I0319 09:25:03.100207 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:25:03.100301 master-0 kubenswrapper[15202]: I0319 09:25:03.100229 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:25:03.100301 master-0 kubenswrapper[15202]: I0319 09:25:03.100250 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:03.100386 master-0 kubenswrapper[15202]: I0319 09:25:03.100278 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:03.100499 master-0 kubenswrapper[15202]: I0319 09:25:03.100454 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-node-tuning-operator-tls\") pod 
\"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:25:03.100558 master-0 kubenswrapper[15202]: I0319 09:25:03.100503 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" Mar 19 09:25:03.100558 master-0 kubenswrapper[15202]: I0319 09:25:03.100529 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.100558 master-0 kubenswrapper[15202]: I0319 09:25:03.100552 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.100687 master-0 kubenswrapper[15202]: I0319 09:25:03.100583 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw" Mar 19 09:25:03.100687 master-0 kubenswrapper[15202]: I0319 09:25:03.100624 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:25:03.100687 master-0 kubenswrapper[15202]: I0319 09:25:03.100657 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.100840 master-0 kubenswrapper[15202]: I0319 09:25:03.100657 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498" Mar 19 09:25:03.100840 master-0 kubenswrapper[15202]: I0319 09:25:03.100704 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:25:03.100840 master-0 kubenswrapper[15202]: I0319 09:25:03.100769 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" Mar 19 09:25:03.100965 master-0 kubenswrapper[15202]: I0319 09:25:03.100868 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db42b38e-294e-4016-8ac1-54126ac60de8-cache\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.100965 master-0 kubenswrapper[15202]: I0319 09:25:03.100921 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:25:03.101044 master-0 kubenswrapper[15202]: I0319 09:25:03.100960 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.101044 master-0 kubenswrapper[15202]: I0319 09:25:03.101017 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod 
\"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.101044 master-0 kubenswrapper[15202]: I0319 09:25:03.101023 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db42b38e-294e-4016-8ac1-54126ac60de8-cache\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.101158 master-0 kubenswrapper[15202]: I0319 09:25:03.101066 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:25:03.101158 master-0 kubenswrapper[15202]: I0319 09:25:03.101089 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ac42112-6a00-4c17-b230-75b565aa668f-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6" Mar 19 09:25:03.101158 master-0 kubenswrapper[15202]: I0319 09:25:03.101106 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:03.101158 master-0 kubenswrapper[15202]: I0319 09:25:03.101134 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txp58\" (UniqueName: \"kubernetes.io/projected/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-kube-api-access-txp58\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:03.101306 master-0 kubenswrapper[15202]: I0319 09:25:03.101137 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:25:03.101306 master-0 kubenswrapper[15202]: I0319 09:25:03.101163 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 09:25:03.101306 master-0 kubenswrapper[15202]: I0319 09:25:03.101229 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.101306 master-0 kubenswrapper[15202]: I0319 09:25:03.101263 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n49x9\" (UniqueName: \"kubernetes.io/projected/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-kube-api-access-n49x9\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:03.101306 master-0 kubenswrapper[15202]: I0319 09:25:03.101299 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0cb70a30-a8d1-4037-81e6-eb4f0510a234-snapshots\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:03.101560 master-0 kubenswrapper[15202]: I0319 09:25:03.101336 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-encryption-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.101560 master-0 kubenswrapper[15202]: I0319 09:25:03.101351 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a75049de-dcf1-4102-b339-f45d5015adea-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:25:03.101560 master-0 kubenswrapper[15202]: I0319 09:25:03.101393 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0cb70a30-a8d1-4037-81e6-eb4f0510a234-snapshots\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:03.101560 master-0 kubenswrapper[15202]: I0319 09:25:03.101377 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52fa1ad-0071-4506-bb94-e73d6f15a75c-hosts-file\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8"
Mar 19 09:25:03.101560 master-0 kubenswrapper[15202]: I0319 09:25:03.101525 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:25:03.101706 master-0 kubenswrapper[15202]: I0319 09:25:03.101581 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:25:03.101706 master-0 kubenswrapper[15202]: I0319 09:25:03.101630 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:25:03.101791 master-0 kubenswrapper[15202]: I0319 09:25:03.101760 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:25:03.101858 master-0 kubenswrapper[15202]: I0319 09:25:03.101830 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:25:03.101915 master-0 kubenswrapper[15202]: I0319 09:25:03.101889 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-catalog-content\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:03.101954 master-0 kubenswrapper[15202]: I0319 09:25:03.101896 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/083882c0-ea2f-4405-8cf1-cce5b91fe602-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:25:03.101989 master-0 kubenswrapper[15202]: I0319 09:25:03.101953 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.101989 master-0 kubenswrapper[15202]: I0319 09:25:03.101956 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-catalog-content\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:03.102048 master-0 kubenswrapper[15202]: I0319 09:25:03.101997 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.102048 master-0 kubenswrapper[15202]: I0319 09:25:03.102036 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:25:03.102103 master-0 kubenswrapper[15202]: I0319 09:25:03.102071 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.102140 master-0 kubenswrapper[15202]: I0319 09:25:03.102119 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.102208 master-0 kubenswrapper[15202]: I0319 09:25:03.102179 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.102342 master-0 kubenswrapper[15202]: I0319 09:25:03.102313 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:25:03.102454 master-0 kubenswrapper[15202]: I0319 09:25:03.102395 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdlx\" (UniqueName: \"kubernetes.io/projected/b8f39c16-3a94-45c3-a51c-f2e81eff967d-kube-api-access-qmdlx\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:25:03.102454 master-0 kubenswrapper[15202]: I0319 09:25:03.102426 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dwx6\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-kube-api-access-8dwx6\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"
Mar 19 09:25:03.102454 master-0 kubenswrapper[15202]: I0319 09:25:03.102453 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:25:03.102454 master-0 kubenswrapper[15202]: I0319 09:25:03.102485 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102508 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102542 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102583 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc9945ac-4041-4120-b504-a173c2bf91bd-serving-cert\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102598 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-config\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102611 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-sys\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102653 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/208939f5-8fca-4fd5-b0c6-43484b7d1e30-srv-cert\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102676 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102879 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a8e2194-aba6-4929-a29c-47c63c8ff799-metrics-tls\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:25:03.102894 master-0 kubenswrapper[15202]: I0319 09:25:03.102886 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvp9m\" (UniqueName: \"kubernetes.io/projected/d32541c9-eef6-417c-9f5a-a7392dc70aa0-kube-api-access-fvp9m\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:25:03.103235 master-0 kubenswrapper[15202]: I0319 09:25:03.102961 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit-dir\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.103235 master-0 kubenswrapper[15202]: I0319 09:25:03.103004 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-host\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.103235 master-0 kubenswrapper[15202]: I0319 09:25:03.103049 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.103235 master-0 kubenswrapper[15202]: I0319 09:25:03.103096 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-utilities\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:03.103235 master-0 kubenswrapper[15202]: I0319 09:25:03.103220 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-utilities\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:03.103428 master-0 kubenswrapper[15202]: I0319 09:25:03.103278 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f8fdab32-4e61-4e9c-a506-52121f625669-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:25:03.103428 master-0 kubenswrapper[15202]: I0319 09:25:03.103332 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-env-overrides\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.103428 master-0 kubenswrapper[15202]: I0319 09:25:03.103338 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:25:03.103428 master-0 kubenswrapper[15202]: I0319 09:25:03.103392 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103439 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-serving-ca\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103512 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103554 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103597 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103641 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-tuned\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103686 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103732 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103782 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103812 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/310d604b-fe9a-4b19-b8b5-7a1983e45e67-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103817 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-tuned\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.103888 master-0 kubenswrapper[15202]: I0319 09:25:03.103833 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.104914 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cni-binary-copy\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.104964 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105054 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105098 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105128 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc9945ac-4041-4120-b504-a173c2bf91bd-service-ca\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105151 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105177 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105200 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105222 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105243 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-serving-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105306 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-modprobe-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105331 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-kubernetes\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105359 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105382 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105404 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f0d16aa2-494d-4a65-880d-3d87219178b5-tmpfs\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105426 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105450 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105555 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105591 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105620 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-encryption-config\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105649 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105679 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105702 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105724 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/467c2f01-2c23-41e2-acb9-08a84061fefc-rootfs\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105747 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105771 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105794 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-systemd\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105816 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105840 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105868 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105891 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105909 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105935 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.105981 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106006 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106029 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106065 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106088 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106111 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") pod 
\"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106132 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-run\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106522 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-conf\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106561 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdb6\" (UniqueName: \"kubernetes.io/projected/cd42096c-f18d-4bb5-8a51-8761dc1edb73-kube-api-access-dxdb6\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106589 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106613 15202 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-image-import-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106631 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106654 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106677 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt22g\" (UniqueName: \"kubernetes.io/projected/9ca444a4-4d78-456f-9656-0c28076ce77e-kube-api-access-kt22g\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106697 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " 
pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106719 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-serving-cert\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106738 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-dir\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106760 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-utilities\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106782 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106815 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1dd59466-0133-41fe-a648-28db73aa861b-cache\") pod 
\"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106836 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzntq\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-kube-api-access-gzntq\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106859 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106887 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106910 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 
09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106933 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106951 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106972 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.106993 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107017 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: 
\"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107037 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107057 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107078 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107097 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-tmp\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107238 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107258 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-catalog-content\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107280 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107301 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107321 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-node-pullsecrets\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107354 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws5kr\" (UniqueName: \"kubernetes.io/projected/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-kube-api-access-ws5kr\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107380 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107399 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107419 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107440 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107462 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107531 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107550 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107574 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 
09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107597 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107618 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107641 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xl5z\" (UniqueName: \"kubernetes.io/projected/f8fdab32-4e61-4e9c-a506-52121f625669-kube-api-access-5xl5z\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107671 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107704 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvg4q\" (UniqueName: 
\"kubernetes.io/projected/d52fa1ad-0071-4506-bb94-e73d6f15a75c-kube-api-access-xvg4q\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107744 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107776 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107799 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107828 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-trusted-ca-bundle\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107851 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107875 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107893 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107916 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-catalog-content\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107941 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgsm7\" (UniqueName: \"kubernetes.io/projected/e3376275-294d-446d-9b4c-930df60dba01-kube-api-access-cgsm7\") pod \"csi-snapshot-controller-64854d9cff-dzfgb\" (UID: 
\"e3376275-294d-446d-9b4c-930df60dba01\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107963 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.107988 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.108008 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-policies\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.108034 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.108060 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:25:03.107957 master-0 kubenswrapper[15202]: I0319 09:25:03.108084 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108103 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108127 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108154 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108178 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108203 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-client\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108224 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108290 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1dd59466-0133-41fe-a648-28db73aa861b-cache\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: E0319 09:25:03.108730 15202 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.108666 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-catalog-content\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: E0319 09:25:03.108835 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:03.608795673 +0000 UTC m=+20.994210489 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.109075 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.109093 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2898746-6827-41d9-ac88-64206cb84ac9-webhook-cert\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.109527 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovn-node-metrics-cert\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.109622 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-script-lib\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.109804 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96902651-8e2b-44c2-be80-0a8c7c28cb58-ovnkube-config\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.109942 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.110090 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.110665 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b36f3b2-caf9-40ad-a3a1-e83796142f54-config\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.110736 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-utilities\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.110878 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4256d841-23cb-4756-b827-f44ee6e54def-metrics-certs\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: E0319 09:25:03.110967 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111056 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357980ba-1957-412f-afb5-04281eca2bee-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111158 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a417fe25-4aca-471c-941d-c195b6141042-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: E0319 09:25:03.111188 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:03.611167948 +0000 UTC m=+20.996582764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111230 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083882c0-ea2f-4405-8cf1-cce5b91fe602-config\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/33e92e5d-61ea-45b2-b357-ebffdaebf4af-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: E0319 09:25:03.111306 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111355 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111405 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1943401-a75b-4e45-8c65-3cc36018d8c4-catalog-content\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111421 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f2148fe-f9f6-47da-894c-b88dae360ebe-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111547 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-tmp\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111671 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: E0319 09:25:03.111359 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:03.611342642 +0000 UTC m=+20.996757458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111743 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-ca\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111774 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a417fe25-4aca-471c-941d-c195b6141042-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.111908 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b36f3b2-caf9-40ad-a3a1-e83796142f54-serving-cert\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112275 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b2898746-6827-41d9-ac88-64206cb84ac9-env-overrides\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112424 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-serving-cert\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112546 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86c4b0e4-3481-465d-b00f-022d2c58c183-config\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112616 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-metrics-tls\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112803 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-client\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112823 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-metrics-tls\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.112956 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8aa0f17a-287e-4a19-9a59-4913e7707071-srv-cert\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113112 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a823c8bc-09ef-46a9-a1f3-155a34b89788-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113323 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75049de-dcf1-4102-b339-f45d5015adea-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113424 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f0d16aa2-494d-4a65-880d-3d87219178b5-tmpfs\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113489 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-daemon-config\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.103874 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a8e2194-aba6-4929-a29c-47c63c8ff799-trusted-ca\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113823 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ac42112-6a00-4c17-b230-75b565aa668f-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113893 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113927 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.113932 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-var-lib-kubelet\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114014 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114018 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114356 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114406 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9663cc40-a69d-42ba-890e-071cb85062f5-etcd-client\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114417 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-serving-cert\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114505 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114648 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-env-overrides\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114813 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357980ba-1957-412f-afb5-04281eca2bee-serving-cert\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.114963 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115024 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115148 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8f39c16-3a94-45c3-a51c-f2e81eff967d-config-volume\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115192 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115352 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115377 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86c4b0e4-3481-465d-b00f-022d2c58c183-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115404 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-serving-cert\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115702 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-lib-modules\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115907 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysconfig\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.115984 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb70a30-a8d1-4037-81e6-eb4f0510a234-serving-cert\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116026 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116035 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116123 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116158 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zz2n\" (UniqueName: \"kubernetes.io/projected/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc-kube-api-access-2zz2n\") pod \"migrator-8487694857-nkvjk\" (UID: \"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116394 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9ebcecb-c210-434e-83a1-825265e206f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116516 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-trusted-ca-bundle\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.116761 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.117338 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.120687 master-0 kubenswrapper[15202]: I0319 09:25:03.117781 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.134278 master-0 kubenswrapper[15202]: I0319 09:25:03.134230 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:25:03.141875 master-0 kubenswrapper[15202]: I0319 09:25:03.141810 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-encryption-config\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:03.155679 master-0 kubenswrapper[15202]: I0319 09:25:03.155582 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:25:03.156102 master-0 kubenswrapper[15202]: I0319 09:25:03.156042 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-serving-cert\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.174159 master-0 kubenswrapper[15202]: I0319 09:25:03.174096 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:25:03.186544 master-0 kubenswrapper[15202]: I0319 09:25:03.186494 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a4fd337-c385-4f56-965c-d68ee0a4e848-encryption-config\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.194792 master-0 kubenswrapper[15202]: I0319 09:25:03.194731 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 09:25:03.203989 master-0 kubenswrapper[15202]: I0319 09:25:03.203925 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-image-import-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:03.214879 master-0 kubenswrapper[15202]: I0319 09:25:03.214817 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:25:03.218681 master-0 kubenswrapper[15202]: I0319 09:25:03.218620 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:03.218756 master-0 kubenswrapper[15202]: I0319 09:25:03.218698 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/467c2f01-2c23-41e2-acb9-08a84061fefc-rootfs\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:25:03.218806 master-0 kubenswrapper[15202]: I0319 09:25:03.218769 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/467c2f01-2c23-41e2-acb9-08a84061fefc-rootfs\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:25:03.218806 master-0 kubenswrapper[15202]:
I0319 09:25:03.218786 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-hostroot\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.218870 master-0 kubenswrapper[15202]: I0319 09:25:03.218816 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.218870 master-0 kubenswrapper[15202]: I0319 09:25:03.218844 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-netns\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.218930 master-0 kubenswrapper[15202]: I0319 09:25:03.218891 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-systemd\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.219014 master-0 kubenswrapper[15202]: I0319 09:25:03.218989 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:03.219066 master-0 kubenswrapper[15202]: I0319 09:25:03.219048 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219105 master-0 kubenswrapper[15202]: I0319 09:25:03.219094 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.219163 master-0 kubenswrapper[15202]: I0319 09:25:03.219122 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-systemd\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.219201 master-0 kubenswrapper[15202]: I0319 09:25:03.219131 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-run\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.219230 master-0 kubenswrapper[15202]: I0319 09:25:03.219217 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:03.219261 master-0 kubenswrapper[15202]: I0319 09:25:03.219189 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-run\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.219291 master-0 kubenswrapper[15202]: I0319 09:25:03.219267 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-conf\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.219291 master-0 kubenswrapper[15202]: I0319 09:25:03.219277 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-netd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219349 master-0 kubenswrapper[15202]: I0319 09:25:03.219335 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-dir\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.219379 master-0 kubenswrapper[15202]: I0319 09:25:03.219350 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-multus-certs\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.219411 master-0 kubenswrapper[15202]: I0319 09:25:03.219388 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-dir\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.219508 master-0 kubenswrapper[15202]: I0319 09:25:03.219458 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.219596 master-0 kubenswrapper[15202]: I0319 09:25:03.219571 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-conf\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.219596 master-0 kubenswrapper[15202]: I0319 09:25:03.219580 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-node-pullsecrets\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.219670 master-0 kubenswrapper[15202]: I0319 09:25:03.219623 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.219670 master-0 kubenswrapper[15202]: I0319 09:25:03.219631 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-node-pullsecrets\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.219726 master-0 kubenswrapper[15202]: I0319 09:25:03.219674 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-bin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.219754 master-0 kubenswrapper[15202]: I0319 09:25:03.219731 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219789 master-0 kubenswrapper[15202]: I0319 09:25:03.219767 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:25:03.219789 master-0 kubenswrapper[15202]: I0319 09:25:03.219766 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-socket-dir-parent\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.219844 master-0 kubenswrapper[15202]: I0319 09:25:03.219798 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219844 master-0 kubenswrapper[15202]: I0319 09:25:03.219830 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-host-etc-kube\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl" Mar 19 09:25:03.219844 master-0 kubenswrapper[15202]: I0319 09:25:03.219835 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219929 master-0 kubenswrapper[15202]: I0319 09:25:03.219856 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-var-lib-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219929 master-0 kubenswrapper[15202]: I0319 09:25:03.219869 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.219929 master-0 kubenswrapper[15202]: I0319 09:25:03.219897 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.220010 master-0 kubenswrapper[15202]: I0319 09:25:03.219973 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.220010 master-0 kubenswrapper[15202]: I0319 09:25:03.219979 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.220070 master-0 kubenswrapper[15202]: I0319 09:25:03.220023 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-ovn\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.220122 master-0 kubenswrapper[15202]: I0319 09:25:03.220102 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.220153 master-0 kubenswrapper[15202]: I0319 09:25:03.220135 
15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.220182 master-0 kubenswrapper[15202]: I0319 09:25:03.220165 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-var-lib-kubelet\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.220214 master-0 kubenswrapper[15202]: I0319 09:25:03.220191 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.220277 master-0 kubenswrapper[15202]: I0319 09:25:03.220106 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.220312 master-0 kubenswrapper[15202]: I0319 09:25:03.220268 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " 
pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.220343 master-0 kubenswrapper[15202]: I0319 09:25:03.220313 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.220343 master-0 kubenswrapper[15202]: I0319 09:25:03.220323 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-client\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.220343 master-0 kubenswrapper[15202]: I0319 09:25:03.220327 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.220429 master-0 kubenswrapper[15202]: I0319 09:25:03.220347 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-system-cni-dir\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.220429 master-0 kubenswrapper[15202]: I0319 09:25:03.220358 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-lib-modules\") pod 
\"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.220429 master-0 kubenswrapper[15202]: I0319 09:25:03.220405 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysconfig\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.220548 master-0 kubenswrapper[15202]: I0319 09:25:03.220457 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:03.220548 master-0 kubenswrapper[15202]: I0319 09:25:03.220517 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.220548 master-0 kubenswrapper[15202]: I0319 09:25:03.220537 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-lib-modules\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.220868 master-0 kubenswrapper[15202]: I0319 09:25:03.220822 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.220905 master-0 kubenswrapper[15202]: I0319 09:25:03.220845 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysconfig\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.220905 master-0 kubenswrapper[15202]: I0319 09:25:03.220896 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-etc-kubernetes\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221022 master-0 kubenswrapper[15202]: I0319 09:25:03.220969 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.221110 master-0 kubenswrapper[15202]: I0319 09:25:03.221087 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221154 master-0 kubenswrapper[15202]: I0319 09:25:03.221127 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-var-lib-kubelet\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.221185 master-0 kubenswrapper[15202]: I0319 09:25:03.221162 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:25:03.221218 master-0 kubenswrapper[15202]: I0319 09:25:03.221193 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-sysctl-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.221218 master-0 kubenswrapper[15202]: I0319 09:25:03.221192 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.221277 master-0 kubenswrapper[15202]: I0319 09:25:03.221227 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221277 master-0 kubenswrapper[15202]: I0319 09:25:03.221236 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.221277 master-0 kubenswrapper[15202]: I0319 09:25:03.221255 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221371 master-0 kubenswrapper[15202]: I0319 09:25:03.221293 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221371 master-0 kubenswrapper[15202]: I0319 09:25:03.221294 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bec90db1-02e3-4211-8c33-f8bcc304e3a7-host-slash\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d" Mar 19 09:25:03.221371 master-0 kubenswrapper[15202]: I0319 09:25:03.221328 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-run-netns\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221371 master-0 kubenswrapper[15202]: I0319 09:25:03.221335 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-system-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221371 master-0 kubenswrapper[15202]: I0319 09:25:03.221363 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-etc-openvswitch\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221373 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-cni-bin\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221386 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221408 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221430 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-node-log\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221461 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221507 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.221527 master-0 kubenswrapper[15202]: I0319 09:25:03.221516 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-os-release\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221714 master-0 kubenswrapper[15202]: I0319 09:25:03.221561 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-cni-multus\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221714 master-0 kubenswrapper[15202]: I0319 09:25:03.221612 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221714 master-0 kubenswrapper[15202]: I0319 09:25:03.221645 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.221714 master-0 kubenswrapper[15202]: I0319 09:25:03.221688 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-slash\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.221714 master-0 kubenswrapper[15202]: I0319 09:25:03.221693 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221845 master-0 kubenswrapper[15202]: I0319 09:25:03.221718 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221845 master-0 kubenswrapper[15202]: I0319 09:25:03.221751 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-cnibin\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221845 master-0 kubenswrapper[15202]: I0319 09:25:03.221783 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221845 master-0 kubenswrapper[15202]: I0319 09:25:03.221832 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.221953 master-0 kubenswrapper[15202]: I0319 09:25:03.221866 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.221953 master-0 kubenswrapper[15202]: I0319 09:25:03.221888 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-cni-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.221953 master-0 kubenswrapper[15202]: I0319 09:25:03.221951 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-run-k8s-cni-cncf-io\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.222040 master-0 kubenswrapper[15202]: I0319 09:25:03.221956 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1dd59466-0133-41fe-a648-28db73aa861b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:03.222040 master-0 kubenswrapper[15202]: I0319 09:25:03.221984 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-cnibin\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.222040 master-0 kubenswrapper[15202]: I0319 09:25:03.222015 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.222124 master-0 kubenswrapper[15202]: I0319 09:25:03.222046 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.222124 master-0 kubenswrapper[15202]: I0319 09:25:03.222076 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc9945ac-4041-4120-b504-a173c2bf91bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.222124 master-0 kubenswrapper[15202]: I0319 09:25:03.222116 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-multus-conf-dir\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.222207 master-0 kubenswrapper[15202]: I0319 09:25:03.222117 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222207 master-0 kubenswrapper[15202]: I0319 09:25:03.222145 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-log-socket\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222207 master-0 kubenswrapper[15202]: I0319 09:25:03.222173 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:03.222296 master-0 kubenswrapper[15202]: I0319 09:25:03.222214 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52fa1ad-0071-4506-bb94-e73d6f15a75c-hosts-file\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:25:03.222296 master-0 kubenswrapper[15202]: I0319 09:25:03.222287 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52fa1ad-0071-4506-bb94-e73d6f15a75c-hosts-file\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8" Mar 19 09:25:03.222362 master-0 kubenswrapper[15202]: I0319 09:25:03.222293 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222362 master-0 kubenswrapper[15202]: I0319 09:25:03.222318 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"installer-2-master-0\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:03.222362 master-0 kubenswrapper[15202]: I0319 09:25:03.222321 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.222362 master-0 kubenswrapper[15202]: I0319 09:25:03.222358 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222492 master-0 kubenswrapper[15202]: I0319 09:25:03.222365 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9ebcecb-c210-434e-83a1-825265e206f1-os-release\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb" Mar 19 09:25:03.222492 master-0 kubenswrapper[15202]: I0319 09:25:03.222382 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222492 master-0 kubenswrapper[15202]: I0319 09:25:03.222425 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-run-systemd\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222492 master-0 kubenswrapper[15202]: I0319 09:25:03.222439 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.222492 master-0 kubenswrapper[15202]: I0319 09:25:03.222458 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-systemd-units\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222522 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222534 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/872e5f8c-b014-4283-a4d2-0e2cfd29e192-host-var-lib-kubelet\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222597 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-sys\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222624 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db42b38e-294e-4016-8ac1-54126ac60de8-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:03.222749 master-0 
kubenswrapper[15202]: I0319 09:25:03.222643 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96902651-8e2b-44c2-be80-0a8c7c28cb58-host-kubelet\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222676 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit-dir\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222693 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-host\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.222749 master-0 kubenswrapper[15202]: I0319 09:25:03.222726 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit-dir\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.222962 master-0 kubenswrapper[15202]: I0319 09:25:03.222818 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-kubernetes\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.222962 master-0 kubenswrapper[15202]: I0319 09:25:03.222855 
15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-modprobe-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.223029 master-0 kubenswrapper[15202]: I0319 09:25:03.223011 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-modprobe-d\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.223074 master-0 kubenswrapper[15202]: I0319 09:25:03.223041 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-host\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.223074 master-0 kubenswrapper[15202]: I0319 09:25:03.223072 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-sys\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.223146 master-0 kubenswrapper[15202]: I0319 09:25:03.223124 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-etc-kubernetes\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s" Mar 19 09:25:03.233716 master-0 kubenswrapper[15202]: I0319 09:25:03.233661 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:25:03.254256 master-0 kubenswrapper[15202]: I0319 09:25:03.254197 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 09:25:03.258862 master-0 kubenswrapper[15202]: I0319 09:25:03.258817 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-audit\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.273795 master-0 kubenswrapper[15202]: I0319 09:25:03.273695 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:25:03.282355 master-0 kubenswrapper[15202]: I0319 09:25:03.282304 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-etcd-serving-ca\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.302290 master-0 kubenswrapper[15202]: I0319 09:25:03.301277 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:25:03.302290 master-0 kubenswrapper[15202]: I0319 09:25:03.301978 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a4fd337-c385-4f56-965c-d68ee0a4e848-trusted-ca-bundle\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:03.313561 master-0 kubenswrapper[15202]: I0319 09:25:03.313502 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 
09:25:03.321263 master-0 kubenswrapper[15202]: I0319 09:25:03.321218 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2bff8a5-c45d-4d28-8771-2239ad0fa578-serving-cert\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.333888 master-0 kubenswrapper[15202]: I0319 09:25:03.333842 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:25:03.354417 master-0 kubenswrapper[15202]: I0319 09:25:03.354335 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:25:03.361517 master-0 kubenswrapper[15202]: I0319 09:25:03.361436 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-audit-policies\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.374254 master-0 kubenswrapper[15202]: I0319 09:25:03.374184 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:25:03.379035 master-0 kubenswrapper[15202]: I0319 09:25:03.377599 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-trusted-ca-bundle\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.393650 master-0 kubenswrapper[15202]: I0319 09:25:03.393598 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 
09:25:03.394091 master-0 kubenswrapper[15202]: I0319 09:25:03.394028 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b2bff8a5-c45d-4d28-8771-2239ad0fa578-etcd-serving-ca\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:03.414732 master-0 kubenswrapper[15202]: I0319 09:25:03.414684 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:25:03.435313 master-0 kubenswrapper[15202]: I0319 09:25:03.435269 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 09:25:03.454971 master-0 kubenswrapper[15202]: I0319 09:25:03.454920 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 09:25:03.476055 master-0 kubenswrapper[15202]: I0319 09:25:03.475738 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 09:25:03.477584 master-0 kubenswrapper[15202]: I0319 09:25:03.477301 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 09:25:03.494271 master-0 kubenswrapper[15202]: I0319 09:25:03.494216 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 09:25:03.498497 master-0 kubenswrapper[15202]: I0319 09:25:03.498445 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8f39c16-3a94-45c3-a51c-f2e81eff967d-metrics-tls\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq" Mar 19 09:25:03.513590 master-0 kubenswrapper[15202]: I0319 09:25:03.513516 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:25:03.523962 master-0 kubenswrapper[15202]: I0319 09:25:03.523824 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc9945ac-4041-4120-b504-a173c2bf91bd-serving-cert\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.526269 master-0 kubenswrapper[15202]: I0319 09:25:03.526191 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") pod \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " Mar 19 09:25:03.526387 master-0 kubenswrapper[15202]: I0319 09:25:03.526279 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") pod \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") " Mar 19 09:25:03.526387 master-0 kubenswrapper[15202]: I0319 09:25:03.526312 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock" (OuterVolumeSpecName: "var-lock") pod "89be0036-a2c8-48b4-9eaf-17fab972c4f4" (UID: "89be0036-a2c8-48b4-9eaf-17fab972c4f4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:25:03.526548 master-0 kubenswrapper[15202]: I0319 09:25:03.526460 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "89be0036-a2c8-48b4-9eaf-17fab972c4f4" (UID: "89be0036-a2c8-48b4-9eaf-17fab972c4f4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:25:03.527668 master-0 kubenswrapper[15202]: I0319 09:25:03.527589 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:25:03.527668 master-0 kubenswrapper[15202]: I0319 09:25:03.527623 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:25:03.534296 master-0 kubenswrapper[15202]: I0319 09:25:03.534236 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 09:25:03.536045 master-0 kubenswrapper[15202]: I0319 09:25:03.535957 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8f39c16-3a94-45c3-a51c-f2e81eff967d-config-volume\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq" Mar 19 09:25:03.555212 master-0 kubenswrapper[15202]: I0319 09:25:03.555045 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:25:03.559931 master-0 kubenswrapper[15202]: I0319 09:25:03.559893 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc9945ac-4041-4120-b504-a173c2bf91bd-service-ca\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2" Mar 19 09:25:03.573867 master-0 kubenswrapper[15202]: I0319 09:25:03.573822 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 
09:25:03.578145 master-0 kubenswrapper[15202]: I0319 09:25:03.578090 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:03.594332 master-0 kubenswrapper[15202]: I0319 09:25:03.594241 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:25:03.599381 master-0 kubenswrapper[15202]: I0319 09:25:03.599332 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:03.613870 master-0 kubenswrapper[15202]: I0319 09:25:03.613804 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:25:03.621236 master-0 kubenswrapper[15202]: I0319 09:25:03.621177 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:03.629163 master-0 kubenswrapper[15202]: I0319 09:25:03.629102 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod 
\"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.629419 master-0 kubenswrapper[15202]: E0319 09:25:03.629369 15202 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered Mar 19 09:25:03.629497 master-0 kubenswrapper[15202]: I0319 09:25:03.629408 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:03.629497 master-0 kubenswrapper[15202]: E0319 09:25:03.629489 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.629443564 +0000 UTC m=+22.014858380 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered Mar 19 09:25:03.629618 master-0 kubenswrapper[15202]: E0319 09:25:03.629573 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered Mar 19 09:25:03.629665 master-0 kubenswrapper[15202]: E0319 09:25:03.629656 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.629633138 +0000 UTC m=+22.015048154 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:03.629847 master-0 kubenswrapper[15202]: I0319 09:25:03.629813 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:03.630077 master-0 kubenswrapper[15202]: E0319 09:25:03.630028 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:03.630159 master-0 kubenswrapper[15202]: E0319 09:25:03.630144 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.630124529 +0000 UTC m=+22.015539345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:03.633618 master-0 kubenswrapper[15202]: I0319 09:25:03.633577 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:25:03.655388 master-0 kubenswrapper[15202]: I0319 09:25:03.655319 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:25:03.680999 master-0 kubenswrapper[15202]: I0319 09:25:03.680931 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:25:03.683155 master-0 kubenswrapper[15202]: I0319 09:25:03.683115 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:25:03.694181 master-0 kubenswrapper[15202]: I0319 09:25:03.694120 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 09:25:03.714571 master-0 kubenswrapper[15202]: I0319 09:25:03.714519 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:25:03.722163 master-0 kubenswrapper[15202]: I0319 09:25:03.722068 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:25:03.734085 master-0 kubenswrapper[15202]: I0319 09:25:03.734035 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:25:03.741464 master-0 kubenswrapper[15202]: I0319 09:25:03.741418 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:25:03.755588 master-0 kubenswrapper[15202]: I0319 09:25:03.755502 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:25:03.763502 master-0 kubenswrapper[15202]: I0319 09:25:03.763427 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:25:03.778275 master-0 kubenswrapper[15202]: I0319 09:25:03.778161 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:25:03.794701 master-0 kubenswrapper[15202]: I0319 09:25:03.794642 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:25:03.814547 master-0 kubenswrapper[15202]: I0319 09:25:03.814454 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-jwt9n"
Mar 19 09:25:03.832369 master-0 kubenswrapper[15202]: I0319 09:25:03.832312 15202 request.go:700] Waited for 1.015806835s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0
Mar 19 09:25:03.834825 master-0 kubenswrapper[15202]: I0319 09:25:03.834772 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 09:25:03.844218 master-0 kubenswrapper[15202]: I0319 09:25:03.844186 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f8fdab32-4e61-4e9c-a506-52121f625669-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:25:03.855450 master-0 kubenswrapper[15202]: I0319 09:25:03.855365 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-bgq5z"
Mar 19 09:25:03.875346 master-0 kubenswrapper[15202]: I0319 09:25:03.875288 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 09:25:03.879098 master-0 kubenswrapper[15202]: I0319 09:25:03.879000 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d486ce23-acf7-429a-9739-4770e1a2bf78-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:25:03.896106 master-0 kubenswrapper[15202]: I0319 09:25:03.896019 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 09:25:03.915723 master-0 kubenswrapper[15202]: I0319 09:25:03.915670 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 09:25:03.934135 master-0 kubenswrapper[15202]: I0319 09:25:03.934038 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vp2s5"
Mar 19 09:25:03.955733 master-0 kubenswrapper[15202]: I0319 09:25:03.955631 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-bvdqs"
Mar 19 09:25:03.974809 master-0 kubenswrapper[15202]: I0319 09:25:03.974614 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 09:25:03.995874 master-0 kubenswrapper[15202]: I0319 09:25:03.995799 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-r6z7f"
Mar 19 09:25:04.014999 master-0 kubenswrapper[15202]: I0319 09:25:04.014900 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 09:25:04.019319 master-0 kubenswrapper[15202]: I0319 09:25:04.019249 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:04.035677 master-0 kubenswrapper[15202]: I0319 09:25:04.035447 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 09:25:04.037111 master-0 kubenswrapper[15202]: I0319 09:25:04.037013 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb70a30-a8d1-4037-81e6-eb4f0510a234-serving-cert\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:04.054201 master-0 kubenswrapper[15202]: I0319 09:25:04.054077 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 19 09:25:04.084624 master-0 kubenswrapper[15202]: I0319 09:25:04.084517 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 09:25:04.087216 master-0 kubenswrapper[15202]: I0319 09:25:04.087135 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb70a30-a8d1-4037-81e6-eb4f0510a234-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:04.094929 master-0 kubenswrapper[15202]: I0319 09:25:04.094866 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xvfqf"
Mar 19 09:25:04.096742 master-0 kubenswrapper[15202]: E0319 09:25:04.096656 15202 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.096882 master-0 kubenswrapper[15202]: E0319 09:25:04.096842 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls podName:cd42096c-f18d-4bb5-8a51-8761dc1edb73 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.596809168 +0000 UTC m=+21.982223994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-nm9nx" (UID: "cd42096c-f18d-4bb5-8a51-8761dc1edb73") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.099141 master-0 kubenswrapper[15202]: E0319 09:25:04.099084 15202 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.099209 master-0 kubenswrapper[15202]: E0319 09:25:04.099086 15202 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.099272 master-0 kubenswrapper[15202]: E0319 09:25:04.099195 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images podName:cd42096c-f18d-4bb5-8a51-8761dc1edb73 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.599170592 +0000 UTC m=+21.984585448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images") pod "cluster-baremetal-operator-6f69995874-nm9nx" (UID: "cd42096c-f18d-4bb5-8a51-8761dc1edb73") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.099321 master-0 kubenswrapper[15202]: E0319 09:25:04.099294 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images podName:f93b8728-4a33-4ee4-b7c6-cff7d7995953 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.599266644 +0000 UTC m=+21.984681470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images") pod "machine-api-operator-6fbb6cf6f9-qx75g" (UID: "f93b8728-4a33-4ee4-b7c6-cff7d7995953") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.101392 master-0 kubenswrapper[15202]: E0319 09:25:04.101328 15202 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.101513 master-0 kubenswrapper[15202]: E0319 09:25:04.101422 15202 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.101513 master-0 kubenswrapper[15202]: E0319 09:25:04.101448 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls podName:f93b8728-4a33-4ee4-b7c6-cff7d7995953 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.601420394 +0000 UTC m=+21.986835260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls") pod "machine-api-operator-6fbb6cf6f9-qx75g" (UID: "f93b8728-4a33-4ee4-b7c6-cff7d7995953") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.101637 master-0 kubenswrapper[15202]: E0319 09:25:04.101504 15202 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.101637 master-0 kubenswrapper[15202]: E0319 09:25:04.101556 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls podName:141cb120-92da-4d8d-bc29-fc4c433a6336 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.601530446 +0000 UTC m=+21.986945312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls") pod "cluster-samples-operator-85f7577d78-mfxr5" (UID: "141cb120-92da-4d8d-bc29-fc4c433a6336") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.101637 master-0 kubenswrapper[15202]: E0319 09:25:04.101602 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.601574547 +0000 UTC m=+21.986989403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.102757 master-0 kubenswrapper[15202]: E0319 09:25:04.102703 15202 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.102835 master-0 kubenswrapper[15202]: E0319 09:25:04.102817 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert podName:f0d16aa2-494d-4a65-880d-3d87219178b5 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.602789966 +0000 UTC m=+21.988204822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert") pod "packageserver-65cccc5599-mhl2j" (UID: "f0d16aa2-494d-4a65-880d-3d87219178b5") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.102883 master-0 kubenswrapper[15202]: E0319 09:25:04.102852 15202 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.102959 master-0 kubenswrapper[15202]: E0319 09:25:04.102923 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert podName:d32541c9-eef6-417c-9f5a-a7392dc70aa0 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.602905508 +0000 UTC m=+21.988320374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert") pod "cluster-autoscaler-operator-866dc4744-hzrg4" (UID: "d32541c9-eef6-417c-9f5a-a7392dc70aa0") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.103930 master-0 kubenswrapper[15202]: E0319 09:25:04.103887 15202 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.104010 master-0 kubenswrapper[15202]: E0319 09:25:04.103943 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca podName:c2a16f6f-437c-4da5-a797-287e5e1ddbd4 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.603931332 +0000 UTC m=+21.989346158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca") pod "cloud-credential-operator-744f9dbf77-s7ts2" (UID: "c2a16f6f-437c-4da5-a797-287e5e1ddbd4") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.104010 master-0 kubenswrapper[15202]: E0319 09:25:04.103978 15202 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.104098 master-0 kubenswrapper[15202]: E0319 09:25:04.104004 15202 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.104098 master-0 kubenswrapper[15202]: E0319 09:25:04.104010 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config podName:d32541c9-eef6-417c-9f5a-a7392dc70aa0 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.604003344 +0000 UTC m=+21.989418170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config") pod "cluster-autoscaler-operator-866dc4744-hzrg4" (UID: "d32541c9-eef6-417c-9f5a-a7392dc70aa0") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.104172 master-0 kubenswrapper[15202]: E0319 09:25:04.104110 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config podName:467c2f01-2c23-41e2-acb9-08a84061fefc nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.604088096 +0000 UTC m=+21.989502962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config") pod "machine-config-daemon-hgc52" (UID: "467c2f01-2c23-41e2-acb9-08a84061fefc") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.109632 master-0 kubenswrapper[15202]: E0319 09:25:04.109584 15202 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.109701 master-0 kubenswrapper[15202]: E0319 09:25:04.109678 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config podName:9ca444a4-4d78-456f-9656-0c28076ce77e nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.609660984 +0000 UTC m=+21.995075820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config") pod "machine-config-operator-84d549f6d5-fdwf5" (UID: "9ca444a4-4d78-456f-9656-0c28076ce77e") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.111843 master-0 kubenswrapper[15202]: E0319 09:25:04.111789 15202 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.111911 master-0 kubenswrapper[15202]: E0319 09:25:04.111872 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config podName:cd42096c-f18d-4bb5-8a51-8761dc1edb73 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.611856204 +0000 UTC m=+21.997271030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config") pod "cluster-baremetal-operator-6f69995874-nm9nx" (UID: "cd42096c-f18d-4bb5-8a51-8761dc1edb73") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.111972 master-0 kubenswrapper[15202]: E0319 09:25:04.111899 15202 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112072 master-0 kubenswrapper[15202]: E0319 09:25:04.111907 15202 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112125 master-0 kubenswrapper[15202]: E0319 09:25:04.111911 15202 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112215 master-0 kubenswrapper[15202]: E0319 09:25:04.112021 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls podName:467c2f01-2c23-41e2-acb9-08a84061fefc nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.611993947 +0000 UTC m=+21.997408803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls") pod "machine-config-daemon-hgc52" (UID: "467c2f01-2c23-41e2-acb9-08a84061fefc") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112264 master-0 kubenswrapper[15202]: E0319 09:25:04.112248 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert podName:cd42096c-f18d-4bb5-8a51-8761dc1edb73 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.612219933 +0000 UTC m=+21.997634789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert") pod "cluster-baremetal-operator-6f69995874-nm9nx" (UID: "cd42096c-f18d-4bb5-8a51-8761dc1edb73") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112308 master-0 kubenswrapper[15202]: E0319 09:25:04.112284 15202 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112389 master-0 kubenswrapper[15202]: E0319 09:25:04.112290 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert podName:c2a16f6f-437c-4da5-a797-287e5e1ddbd4 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.612270454 +0000 UTC m=+21.997685310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-744f9dbf77-s7ts2" (UID: "c2a16f6f-437c-4da5-a797-287e5e1ddbd4") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.112461 master-0 kubenswrapper[15202]: E0319 09:25:04.112413 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.612389487 +0000 UTC m=+21.997804343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.113076 master-0 kubenswrapper[15202]: E0319 09:25:04.113022 15202 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.113170 master-0 kubenswrapper[15202]: E0319 09:25:04.113112 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config podName:dea35f60-33be-4ccc-b985-952eac3a85c0 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.613099802 +0000 UTC m=+21.998514628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config") pod "machine-approver-5c6485487f-cscz5" (UID: "dea35f60-33be-4ccc-b985-952eac3a85c0") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.114081 master-0 kubenswrapper[15202]: I0319 09:25:04.114030 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 09:25:04.114591 master-0 kubenswrapper[15202]: E0319 09:25:04.114531 15202 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.114665 master-0 kubenswrapper[15202]: E0319 09:25:04.114633 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls podName:9ca444a4-4d78-456f-9656-0c28076ce77e nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.614612037 +0000 UTC m=+22.000026893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls") pod "machine-config-operator-84d549f6d5-fdwf5" (UID: "9ca444a4-4d78-456f-9656-0c28076ce77e") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.114744 master-0 kubenswrapper[15202]: E0319 09:25:04.114699 15202 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.114866 master-0 kubenswrapper[15202]: E0319 09:25:04.114712 15202 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.114919 master-0 kubenswrapper[15202]: E0319 09:25:04.114835 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config podName:f93b8728-4a33-4ee4-b7c6-cff7d7995953 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.614803391 +0000 UTC m=+22.000218257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config") pod "machine-api-operator-6fbb6cf6f9-qx75g" (UID: "f93b8728-4a33-4ee4-b7c6-cff7d7995953") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.114919 master-0 kubenswrapper[15202]: E0319 09:25:04.114913 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images podName:9ca444a4-4d78-456f-9656-0c28076ce77e nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.614893833 +0000 UTC m=+22.000308669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images") pod "machine-config-operator-84d549f6d5-fdwf5" (UID: "9ca444a4-4d78-456f-9656-0c28076ce77e") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:25:04.116013 master-0 kubenswrapper[15202]: E0319 09:25:04.115981 15202 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.116090 master-0 kubenswrapper[15202]: E0319 09:25:04.116073 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert podName:f0d16aa2-494d-4a65-880d-3d87219178b5 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.61605898 +0000 UTC m=+22.001473806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert") pod "packageserver-65cccc5599-mhl2j" (UID: "f0d16aa2-494d-4a65-880d-3d87219178b5") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.117325 master-0 kubenswrapper[15202]: E0319 09:25:04.117200 15202 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.117325 master-0 kubenswrapper[15202]: E0319 09:25:04.117253 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert podName:31742478-0d83-48cf-b38b-02416d95d4a8 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.617241838 +0000 UTC m=+22.002656664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-7d87854d6-g96tv" (UID: "31742478-0d83-48cf-b38b-02416d95d4a8") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:25:04.133804 master-0 kubenswrapper[15202]: I0319 09:25:04.133714 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 09:25:04.154730 master-0 kubenswrapper[15202]: I0319 09:25:04.154625 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 09:25:04.174434 master-0 kubenswrapper[15202]: I0319 09:25:04.174340 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zxmm6"
Mar 19 09:25:04.193957 master-0 kubenswrapper[15202]: I0319 09:25:04.192948 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 09:25:04.195645 master-0 kubenswrapper[15202]: I0319 09:25:04.195592 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 09:25:04.214270 master-0 kubenswrapper[15202]: I0319 09:25:04.214202 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 09:25:04.235672 master-0 kubenswrapper[15202]: I0319 09:25:04.235610 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 09:25:04.254177 master-0 kubenswrapper[15202]: I0319 09:25:04.254096 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 09:25:04.273930 master-0 kubenswrapper[15202]: I0319 09:25:04.273839 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 09:25:04.295545 master-0 kubenswrapper[15202]: I0319 09:25:04.295330 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 09:25:04.315356 master-0 kubenswrapper[15202]: I0319 09:25:04.315281 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-mdr74"
Mar 19 09:25:04.341887 master-0 kubenswrapper[15202]: I0319 09:25:04.341792 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 09:25:04.354826 master-0 kubenswrapper[15202]: I0319 09:25:04.354771 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 09:25:04.375444 master-0 kubenswrapper[15202]: I0319 09:25:04.375358 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-llwk7"
Mar 19 09:25:04.394924 master-0 kubenswrapper[15202]: I0319 09:25:04.394853 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 09:25:04.415402 master-0 kubenswrapper[15202]: I0319 09:25:04.415310 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:25:04.435682 master-0 kubenswrapper[15202]: I0319 09:25:04.435602 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xcbjl"
Mar 19 09:25:04.454650 master-0 kubenswrapper[15202]: I0319 09:25:04.454566 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 09:25:04.475196 master-0 kubenswrapper[15202]: I0319 09:25:04.475143 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 09:25:04.495207 master-0 kubenswrapper[15202]: I0319 09:25:04.495088 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:25:04.516615 master-0 kubenswrapper[15202]: I0319 09:25:04.514086 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 09:25:04.537845 master-0 kubenswrapper[15202]: I0319 09:25:04.533943 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6s584"
Mar 19 09:25:04.554658 master-0 kubenswrapper[15202]: I0319 09:25:04.554553 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-jtdpn"
Mar 19 09:25:04.574746 master-0 kubenswrapper[15202]: I0319 09:25:04.574685 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 09:25:04.595177 master-0 kubenswrapper[15202]: I0319 09:25:04.594963 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 19 09:25:04.613886 master-0 kubenswrapper[15202]: I0319 09:25:04.613838 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wkbj2"
Mar 19 09:25:04.634065 master-0 kubenswrapper[15202]: I0319 09:25:04.634021 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 09:25:04.653954 master-0 kubenswrapper[15202]: I0319 09:25:04.653890 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-pr68p"
Mar 19 09:25:04.656863 master-0 kubenswrapper[15202]: I0319 09:25:04.656812 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5"
Mar 19 09:25:04.657057 master-0 kubenswrapper[15202]: I0319 09:25:04.656892 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" Mar 19 09:25:04.657057 master-0 kubenswrapper[15202]: I0319 09:25:04.656943 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:04.657057 master-0 kubenswrapper[15202]: I0319 09:25:04.657030 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:04.657191 master-0 kubenswrapper[15202]: I0319 09:25:04.657081 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:25:04.657315 master-0 kubenswrapper[15202]: I0319 09:25:04.657287 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52" Mar 19 09:25:04.657369 master-0 kubenswrapper[15202]: I0319 09:25:04.657328 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:25:04.657411 master-0 kubenswrapper[15202]: I0319 09:25:04.657360 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/141cb120-92da-4d8d-bc29-fc4c433a6336-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" Mar 19 09:25:04.657411 master-0 kubenswrapper[15202]: I0319 09:25:04.657377 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:04.657554 master-0 kubenswrapper[15202]: I0319 09:25:04.657444 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d32541c9-eef6-417c-9f5a-a7392dc70aa0-cert\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:25:04.657647 master-0 kubenswrapper[15202]: I0319 09:25:04.657624 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d32541c9-eef6-417c-9f5a-a7392dc70aa0-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " 
pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" Mar 19 09:25:04.657710 master-0 kubenswrapper[15202]: I0319 09:25:04.657688 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:25:04.657784 master-0 kubenswrapper[15202]: I0319 09:25:04.657758 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ca444a4-4d78-456f-9656-0c28076ce77e-proxy-tls\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:04.657831 master-0 kubenswrapper[15202]: I0319 09:25:04.657768 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:04.657831 master-0 kubenswrapper[15202]: I0319 09:25:04.657771 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-apiservice-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:04.657926 master-0 kubenswrapper[15202]: I0319 09:25:04.657857 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:04.657926 master-0 kubenswrapper[15202]: I0319 09:25:04.657900 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:04.657926 master-0 kubenswrapper[15202]: I0319 09:25:04.657922 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:25:04.658037 master-0 kubenswrapper[15202]: E0319 09:25:04.657981 15202 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered Mar 19 09:25:04.658152 master-0 kubenswrapper[15202]: E0319 09:25:04.658127 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:25:06.658029671 +0000 UTC m=+24.043444697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered Mar 19 09:25:04.658215 master-0 kubenswrapper[15202]: I0319 09:25:04.658161 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:25:04.658215 master-0 kubenswrapper[15202]: I0319 09:25:04.658192 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:04.658300 master-0 kubenswrapper[15202]: I0319 09:25:04.658224 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:04.658300 master-0 
kubenswrapper[15202]: I0319 09:25:04.658238 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-images\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:04.658300 master-0 kubenswrapper[15202]: I0319 09:25:04.658258 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.658300 master-0 kubenswrapper[15202]: I0319 09:25:04.658290 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" Mar 19 09:25:04.658453 master-0 kubenswrapper[15202]: E0319 09:25:04.658324 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered Mar 19 09:25:04.658453 master-0 kubenswrapper[15202]: I0319 09:25:04.658357 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52" Mar 19 09:25:04.658453 master-0 
kubenswrapper[15202]: E0319 09:25:04.658367 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:06.65835677 +0000 UTC m=+24.043771776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered Mar 19 09:25:04.658453 master-0 kubenswrapper[15202]: I0319 09:25:04.658433 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.658631 master-0 kubenswrapper[15202]: I0319 09:25:04.658551 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:04.658631 master-0 kubenswrapper[15202]: I0319 09:25:04.658605 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-config\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.658701 master-0 kubenswrapper[15202]: I0319 09:25:04.658634 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cert\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.658817 master-0 kubenswrapper[15202]: I0319 09:25:04.658788 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:04.658899 master-0 kubenswrapper[15202]: E0319 09:25:04.658821 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered Mar 19 09:25:04.658899 master-0 kubenswrapper[15202]: I0319 09:25:04.658833 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:04.658899 master-0 kubenswrapper[15202]: E0319 09:25:04.658850 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:25:06.658842441 +0000 UTC m=+24.044257257 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered Mar 19 09:25:04.658899 master-0 kubenswrapper[15202]: I0319 09:25:04.658790 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ca444a4-4d78-456f-9656-0c28076ce77e-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5" Mar 19 09:25:04.659043 master-0 kubenswrapper[15202]: I0319 09:25:04.658910 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:04.659043 master-0 kubenswrapper[15202]: I0319 09:25:04.658959 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv" Mar 19 09:25:04.659043 master-0 kubenswrapper[15202]: I0319 09:25:04.659025 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.659201 master-0 kubenswrapper[15202]: I0319 09:25:04.659119 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.659201 master-0 kubenswrapper[15202]: I0319 09:25:04.659138 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0d16aa2-494d-4a65-880d-3d87219178b5-webhook-cert\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:04.659201 master-0 kubenswrapper[15202]: I0319 09:25:04.659193 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:04.659322 master-0 kubenswrapper[15202]: I0319 09:25:04.659263 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/31742478-0d83-48cf-b38b-02416d95d4a8-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " 
pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv" Mar 19 09:25:04.659322 master-0 kubenswrapper[15202]: I0319 09:25:04.659308 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cd42096c-f18d-4bb5-8a51-8761dc1edb73-images\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.659445 master-0 kubenswrapper[15202]: I0319 09:25:04.659424 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd42096c-f18d-4bb5-8a51-8761dc1edb73-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" Mar 19 09:25:04.673283 master-0 kubenswrapper[15202]: I0319 09:25:04.673216 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-t6dfg" Mar 19 09:25:04.693540 master-0 kubenswrapper[15202]: I0319 09:25:04.693494 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 09:25:04.700218 master-0 kubenswrapper[15202]: I0319 09:25:04.700170 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-config\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:04.713403 master-0 kubenswrapper[15202]: I0319 09:25:04.713352 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 09:25:04.717612 master-0 
kubenswrapper[15202]: I0319 09:25:04.717569 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f93b8728-4a33-4ee4-b7c6-cff7d7995953-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:04.733778 master-0 kubenswrapper[15202]: I0319 09:25:04.733662 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 09:25:04.740367 master-0 kubenswrapper[15202]: I0319 09:25:04.740313 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93b8728-4a33-4ee4-b7c6-cff7d7995953-images\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" Mar 19 09:25:04.753793 master-0 kubenswrapper[15202]: I0319 09:25:04.753745 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 09:25:04.759127 master-0 kubenswrapper[15202]: I0319 09:25:04.759069 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/467c2f01-2c23-41e2-acb9-08a84061fefc-proxy-tls\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52" Mar 19 09:25:04.783803 master-0 kubenswrapper[15202]: I0319 09:25:04.783739 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/467c2f01-2c23-41e2-acb9-08a84061fefc-mcd-auth-proxy-config\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " 
pod="openshift-machine-config-operator/machine-config-daemon-hgc52" Mar 19 09:25:04.787733 master-0 kubenswrapper[15202]: I0319 09:25:04.787690 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-j66zv" Mar 19 09:25:04.793711 master-0 kubenswrapper[15202]: I0319 09:25:04.793657 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 09:25:04.814014 master-0 kubenswrapper[15202]: I0319 09:25:04.813806 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 09:25:04.834042 master-0 kubenswrapper[15202]: I0319 09:25:04.833978 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-l9t78" Mar 19 09:25:04.852049 master-0 kubenswrapper[15202]: I0319 09:25:04.851992 15202 request.go:700] Waited for 2.020729908s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/secrets?fieldSelector=metadata.name%3Dmachine-approver-tls&limit=500&resourceVersion=0 Mar 19 09:25:04.853120 master-0 kubenswrapper[15202]: I0319 09:25:04.853083 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 09:25:04.858944 master-0 kubenswrapper[15202]: I0319 09:25:04.858891 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea35f60-33be-4ccc-b985-952eac3a85c0-machine-approver-tls\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:04.873916 master-0 kubenswrapper[15202]: I0319 
09:25:04.873773 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 09:25:04.877674 master-0 kubenswrapper[15202]: I0319 09:25:04.877617 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:04.894494 master-0 kubenswrapper[15202]: I0319 09:25:04.894389 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 09:25:04.899039 master-0 kubenswrapper[15202]: I0319 09:25:04.898964 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea35f60-33be-4ccc-b985-952eac3a85c0-auth-proxy-config\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:04.992394 master-0 kubenswrapper[15202]: E0319 09:25:04.992346 15202 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:25:05.852801 master-0 kubenswrapper[15202]: I0319 09:25:05.852690 15202 request.go:700] Waited for 2.741783781s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/serviceaccounts/ingress-operator/token Mar 19 09:25:06.000943 master-0 kubenswrapper[15202]: I0319 09:25:06.000888 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5cd\" (UniqueName: 
\"kubernetes.io/projected/bec90db1-02e3-4211-8c33-f8bcc304e3a7-kube-api-access-nr5cd\") pod \"iptables-alerter-2s58d\" (UID: \"bec90db1-02e3-4211-8c33-f8bcc304e3a7\") " pod="openshift-network-operator/iptables-alerter-2s58d"
Mar 19 09:25:06.001386 master-0 kubenswrapper[15202]: I0319 09:25:06.001327 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7x89\" (UniqueName: \"kubernetes.io/projected/0cb70a30-a8d1-4037-81e6-eb4f0510a234-kube-api-access-q7x89\") pod \"insights-operator-68bf6ff9d6-wshz8\" (UID: \"0cb70a30-a8d1-4037-81e6-eb4f0510a234\") " pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8"
Mar 19 09:25:06.002261 master-0 kubenswrapper[15202]: I0319 09:25:06.002208 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwd5\" (UniqueName: \"kubernetes.io/projected/083882c0-ea2f-4405-8cf1-cce5b91fe602-kube-api-access-mlwd5\") pod \"openshift-controller-manager-operator-8c94f4649-xhzf9\" (UID: \"083882c0-ea2f-4405-8cf1-cce5b91fe602\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-xhzf9"
Mar 19 09:25:06.004224 master-0 kubenswrapper[15202]: I0319 09:25:06.004185 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp46p\" (UniqueName: \"kubernetes.io/projected/96902651-8e2b-44c2-be80-0a8c7c28cb58-kube-api-access-fp46p\") pod \"ovnkube-node-fwjzr\" (UID: \"96902651-8e2b-44c2-be80-0a8c7c28cb58\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:06.005050 master-0 kubenswrapper[15202]: I0319 09:25:06.004989 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxtcq\" (UniqueName: \"kubernetes.io/projected/467c2f01-2c23-41e2-acb9-08a84061fefc-kube-api-access-mxtcq\") pod \"machine-config-daemon-hgc52\" (UID: \"467c2f01-2c23-41e2-acb9-08a84061fefc\") " pod="openshift-machine-config-operator/machine-config-daemon-hgc52"
Mar 19 09:25:06.010130 master-0 kubenswrapper[15202]: I0319 09:25:06.010080 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lktk8\" (UniqueName: \"kubernetes.io/projected/208939f5-8fca-4fd5-b0c6-43484b7d1e30-kube-api-access-lktk8\") pod \"catalog-operator-68f85b4d6c-j92kd\" (UID: \"208939f5-8fca-4fd5-b0c6-43484b7d1e30\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd"
Mar 19 09:25:06.011042 master-0 kubenswrapper[15202]: I0319 09:25:06.010997 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xl5z\" (UniqueName: \"kubernetes.io/projected/f8fdab32-4e61-4e9c-a506-52121f625669-kube-api-access-5xl5z\") pod \"multus-admission-controller-58c9f8fc64-cr9pg\" (UID: \"f8fdab32-4e61-4e9c-a506-52121f625669\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-cr9pg"
Mar 19 09:25:06.011174 master-0 kubenswrapper[15202]: I0319 09:25:06.011131 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz7d6\" (UniqueName: \"kubernetes.io/projected/31742478-0d83-48cf-b38b-02416d95d4a8-kube-api-access-wz7d6\") pod \"cluster-storage-operator-7d87854d6-g96tv\" (UID: \"31742478-0d83-48cf-b38b-02416d95d4a8\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv"
Mar 19 09:25:06.012767 master-0 kubenswrapper[15202]: I0319 09:25:06.012734 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rltcj\" (UniqueName: \"kubernetes.io/projected/39bf78ac-304b-4b82-8729-d184657ef3bb-kube-api-access-rltcj\") pod \"redhat-marketplace-wzz6n\" (UID: \"39bf78ac-304b-4b82-8729-d184657ef3bb\") " pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:06.014328 master-0 kubenswrapper[15202]: I0319 09:25:06.014298 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6zkv\" (UniqueName: \"kubernetes.io/projected/9663cc40-a69d-42ba-890e-071cb85062f5-kube-api-access-n6zkv\") pod \"etcd-operator-8544cbcf9c-ct498\" (UID: \"9663cc40-a69d-42ba-890e-071cb85062f5\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-ct498"
Mar 19 09:25:06.015009 master-0 kubenswrapper[15202]: I0319 09:25:06.014962 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzntq\" (UniqueName: \"kubernetes.io/projected/1dd59466-0133-41fe-a648-28db73aa861b-kube-api-access-gzntq\") pod \"catalogd-controller-manager-6864dc98f7-7wdws\" (UID: \"1dd59466-0133-41fe-a648-28db73aa861b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws"
Mar 19 09:25:06.017036 master-0 kubenswrapper[15202]: I0319 09:25:06.017003 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljb2\" (UniqueName: \"kubernetes.io/projected/d90f590a-6118-4769-b18f-fec67dd62c20-kube-api-access-nljb2\") pod \"service-ca-79bc6b8d76-xlhg9\" (UID: \"d90f590a-6118-4769-b18f-fec67dd62c20\") " pod="openshift-service-ca/service-ca-79bc6b8d76-xlhg9"
Mar 19 09:25:06.021151 master-0 kubenswrapper[15202]: I0319 09:25:06.021095 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvq8m\" (UniqueName: \"kubernetes.io/projected/ece5177b-ae15-4c33-a8d4-612ab50b2b8b-kube-api-access-pvq8m\") pod \"dns-operator-9c5679d8f-fdxtp\" (UID: \"ece5177b-ae15-4c33-a8d4-612ab50b2b8b\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-fdxtp"
Mar 19 09:25:06.022925 master-0 kubenswrapper[15202]: I0319 09:25:06.022885 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548cd\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-kube-api-access-548cd\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:25:06.023641 master-0 kubenswrapper[15202]: I0319 09:25:06.023593 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dwx6\" (UniqueName: \"kubernetes.io/projected/db42b38e-294e-4016-8ac1-54126ac60de8-kube-api-access-8dwx6\") pod \"operator-controller-controller-manager-57777556ff-pn5gg\" (UID: \"db42b38e-294e-4016-8ac1-54126ac60de8\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"
Mar 19 09:25:06.026206 master-0 kubenswrapper[15202]: I0319 09:25:06.026167 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcvr\" (UniqueName: \"kubernetes.io/projected/4256d841-23cb-4756-b827-f44ee6e54def-kube-api-access-ptcvr\") pod \"network-metrics-daemon-p76jz\" (UID: \"4256d841-23cb-4756-b827-f44ee6e54def\") " pod="openshift-multus/network-metrics-daemon-p76jz"
Mar 19 09:25:06.026309 master-0 kubenswrapper[15202]: I0319 09:25:06.026225 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws5kr\" (UniqueName: \"kubernetes.io/projected/c2a16f6f-437c-4da5-a797-287e5e1ddbd4-kube-api-access-ws5kr\") pod \"cloud-credential-operator-744f9dbf77-s7ts2\" (UID: \"c2a16f6f-437c-4da5-a797-287e5e1ddbd4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2"
Mar 19 09:25:06.027376 master-0 kubenswrapper[15202]: I0319 09:25:06.027341 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvg4q\" (UniqueName: \"kubernetes.io/projected/d52fa1ad-0071-4506-bb94-e73d6f15a75c-kube-api-access-xvg4q\") pod \"node-resolver-pmxm8\" (UID: \"d52fa1ad-0071-4506-bb94-e73d6f15a75c\") " pod="openshift-dns/node-resolver-pmxm8"
Mar 19 09:25:06.027803 master-0 kubenswrapper[15202]: I0319 09:25:06.027749 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdlx\" (UniqueName: \"kubernetes.io/projected/b8f39c16-3a94-45c3-a51c-f2e81eff967d-kube-api-access-qmdlx\") pod \"dns-default-p88qq\" (UID: \"b8f39c16-3a94-45c3-a51c-f2e81eff967d\") " pod="openshift-dns/dns-default-p88qq"
Mar 19 09:25:06.030135 master-0 kubenswrapper[15202]: I0319 09:25:06.030096 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgsm7\" (UniqueName: \"kubernetes.io/projected/e3376275-294d-446d-9b4c-930df60dba01-kube-api-access-cgsm7\") pod \"csi-snapshot-controller-64854d9cff-dzfgb\" (UID: \"e3376275-294d-446d-9b4c-930df60dba01\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb"
Mar 19 09:25:06.031189 master-0 kubenswrapper[15202]: I0319 09:25:06.031139 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8vk\" (UniqueName: \"kubernetes.io/projected/7ad3ef11-90df-40b1-acbf-ed9b0c708ddb-kube-api-access-qv8vk\") pod \"cluster-monitoring-operator-58845fbb57-z2869\" (UID: \"7ad3ef11-90df-40b1-acbf-ed9b0c708ddb\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-z2869"
Mar 19 09:25:06.032188 master-0 kubenswrapper[15202]: I0319 09:25:06.032146 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrs54\" (UniqueName: \"kubernetes.io/projected/307605e6-d1cf-4172-8e7d-918c435f3577-kube-api-access-wrs54\") pod \"network-check-target-95w9b\" (UID: \"307605e6-d1cf-4172-8e7d-918c435f3577\") " pod="openshift-network-diagnostics/network-check-target-95w9b"
Mar 19 09:25:06.032188 master-0 kubenswrapper[15202]: I0319 09:25:06.032185 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfw5k\" (UniqueName: \"kubernetes.io/projected/f93b8728-4a33-4ee4-b7c6-cff7d7995953-kube-api-access-kfw5k\") pod \"machine-api-operator-6fbb6cf6f9-qx75g\" (UID: \"f93b8728-4a33-4ee4-b7c6-cff7d7995953\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g"
Mar 19 09:25:06.046790 master-0 kubenswrapper[15202]: I0319 09:25:06.035507 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvp9m\" (UniqueName: \"kubernetes.io/projected/d32541c9-eef6-417c-9f5a-a7392dc70aa0-kube-api-access-fvp9m\") pod \"cluster-autoscaler-operator-866dc4744-hzrg4\" (UID: \"d32541c9-eef6-417c-9f5a-a7392dc70aa0\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4"
Mar 19 09:25:06.049809 master-0 kubenswrapper[15202]: I0319 09:25:06.049311 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49lj\" (UniqueName: \"kubernetes.io/projected/e09725c2-45c6-4a60-b817-6e5316d6f8e8-kube-api-access-b49lj\") pod \"csi-snapshot-controller-operator-5f5d689c6b-dspnb\" (UID: \"e09725c2-45c6-4a60-b817-6e5316d6f8e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-dspnb"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.050194 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft9rs\" (UniqueName: \"kubernetes.io/projected/8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823-kube-api-access-ft9rs\") pod \"network-operator-7bd846bfc4-jxvxl\" (UID: \"8a6be5d9-c0d3-49c3-bb9a-4c8bec66b823\") " pod="openshift-network-operator/network-operator-7bd846bfc4-jxvxl"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.050374 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpv6\" (UniqueName: \"kubernetes.io/projected/872e5f8c-b014-4283-a4d2-0e2cfd29e192-kube-api-access-kfpv6\") pod \"multus-8svct\" (UID: \"872e5f8c-b014-4283-a4d2-0e2cfd29e192\") " pod="openshift-multus/multus-8svct"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.050767 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsdjh\" (UniqueName: \"kubernetes.io/projected/f0d16aa2-494d-4a65-880d-3d87219178b5-kube-api-access-fsdjh\") pod \"packageserver-65cccc5599-mhl2j\" (UID: \"f0d16aa2-494d-4a65-880d-3d87219178b5\") " pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.050850 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdb6\" (UniqueName: \"kubernetes.io/projected/cd42096c-f18d-4bb5-8a51-8761dc1edb73-kube-api-access-dxdb6\") pod \"cluster-baremetal-operator-6f69995874-nm9nx\" (UID: \"cd42096c-f18d-4bb5-8a51-8761dc1edb73\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.050978 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfsx\" (UniqueName: \"kubernetes.io/projected/7fda0d28-6511-4577-9cd3-58a9c1a64d4e-kube-api-access-rnfsx\") pod \"tuned-vkw4s\" (UID: \"7fda0d28-6511-4577-9cd3-58a9c1a64d4e\") " pod="openshift-cluster-node-tuning-operator/tuned-vkw4s"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.051142 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxz5\" (UniqueName: \"kubernetes.io/projected/33e92e5d-61ea-45b2-b357-ebffdaebf4af-kube-api-access-npxz5\") pod \"marketplace-operator-89ccd998f-6qck2\" (UID: \"33e92e5d-61ea-45b2-b357-ebffdaebf4af\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.051260 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvqh\" (UniqueName: \"kubernetes.io/projected/a75049de-dcf1-4102-b339-f45d5015adea-kube-api-access-4mvqh\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw\" (UID: \"a75049de-dcf1-4102-b339-f45d5015adea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-hn7cw"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.051298 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/62d3ca81-26e1-4625-a3aa-b1eabd31cfd6-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-6fzwb\" (UID: \"62d3ca81-26e1-4625-a3aa-b1eabd31cfd6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-6fzwb"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.051772 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc9945ac-4041-4120-b504-a173c2bf91bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-thkn2\" (UID: \"dc9945ac-4041-4120-b504-a173c2bf91bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-thkn2"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.052105 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a8e2194-aba6-4929-a29c-47c63c8ff799-bound-sa-token\") pod \"ingress-operator-66b84d69b-pgdrx\" (UID: \"6a8e2194-aba6-4929-a29c-47c63c8ff799\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.051989 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdmtg\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-kube-api-access-wdmtg\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.052164 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt22g\" (UniqueName: \"kubernetes.io/projected/9ca444a4-4d78-456f-9656-0c28076ce77e-kube-api-access-kt22g\") pod \"machine-config-operator-84d549f6d5-fdwf5\" (UID: \"9ca444a4-4d78-456f-9656-0c28076ce77e\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-fdwf5"
Mar 19 09:25:06.053504 master-0 kubenswrapper[15202]: I0319 09:25:06.052930 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txp58\" (UniqueName: \"kubernetes.io/projected/dd69fc33-59d4-4538-b4ec-e2d08ac11f72-kube-api-access-txp58\") pod \"certified-operators-tkx45\" (UID: \"dd69fc33-59d4-4538-b4ec-e2d08ac11f72\") " pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:06.053982 master-0 kubenswrapper[15202]: I0319 09:25:06.053552 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxfs\" (UniqueName: \"kubernetes.io/projected/f1943401-a75b-4e45-8c65-3cc36018d8c4-kube-api-access-8cxfs\") pod \"redhat-operators-zpvpd\" (UID: \"f1943401-a75b-4e45-8c65-3cc36018d8c4\") " pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:25:06.053982 master-0 kubenswrapper[15202]: I0319 09:25:06.053611 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4rtm\" (UniqueName: \"kubernetes.io/projected/8aa0f17a-287e-4a19-9a59-4913e7707071-kube-api-access-m4rtm\") pod \"olm-operator-5c9796789-wjbt2\" (UID: \"8aa0f17a-287e-4a19-9a59-4913e7707071\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2"
Mar 19 09:25:06.057572 master-0 kubenswrapper[15202]: I0319 09:25:06.054836 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvxj\" (UniqueName: \"kubernetes.io/projected/357980ba-1957-412f-afb5-04281eca2bee-kube-api-access-8zvxj\") pod \"authentication-operator-5885bfd7f4-z8gbk\" (UID: \"357980ba-1957-412f-afb5-04281eca2bee\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-z8gbk"
Mar 19 09:25:06.057572 master-0 kubenswrapper[15202]: I0319 09:25:06.055836 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8nz\" (UniqueName: \"kubernetes.io/projected/f216606b-43d0-43d0-a3e3-a3ee2952e7b8-kube-api-access-bd8nz\") pod \"cluster-olm-operator-67dcd4998-wrdwm\" (UID: \"f216606b-43d0-43d0-a3e3-a3ee2952e7b8\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-wrdwm"
Mar 19 09:25:06.057769 master-0 kubenswrapper[15202]: I0319 09:25:06.057735 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a823c8bc-09ef-46a9-a1f3-155a34b89788-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-rcnnp\" (UID: \"a823c8bc-09ef-46a9-a1f3-155a34b89788\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-rcnnp"
Mar 19 09:25:06.058280 master-0 kubenswrapper[15202]: I0319 09:25:06.058244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9zg8\" (UniqueName: \"kubernetes.io/projected/b2898746-6827-41d9-ac88-64206cb84ac9-kube-api-access-x9zg8\") pod \"network-node-identity-kqb2h\" (UID: \"b2898746-6827-41d9-ac88-64206cb84ac9\") " pod="openshift-network-node-identity/network-node-identity-kqb2h"
Mar 19 09:25:06.058909 master-0 kubenswrapper[15202]: I0319 09:25:06.058834 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2ntw\" (UniqueName: \"kubernetes.io/projected/b2bff8a5-c45d-4d28-8771-2239ad0fa578-kube-api-access-s2ntw\") pod \"apiserver-6fccf84fc5-rnmt2\" (UID: \"b2bff8a5-c45d-4d28-8771-2239ad0fa578\") " pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:06.060679 master-0 kubenswrapper[15202]: I0319 09:25:06.060643 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9dj\" (UniqueName: \"kubernetes.io/projected/3a4fd337-c385-4f56-965c-d68ee0a4e848-kube-api-access-vr9dj\") pod \"apiserver-54cd8888b9-q4ztg\" (UID: \"3a4fd337-c385-4f56-965c-d68ee0a4e848\") " pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:06.060852 master-0 kubenswrapper[15202]: I0319 09:25:06.060815 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/310d604b-fe9a-4b19-b8b5-7a1983e45e67-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-tvm5p\" (UID: \"310d604b-fe9a-4b19-b8b5-7a1983e45e67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-tvm5p"
Mar 19 09:25:06.127952 master-0 kubenswrapper[15202]: I0319 09:25:06.127525 15202 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:25:06.707262 master-0 kubenswrapper[15202]: I0319 09:25:06.707178 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:06.707608 master-0 kubenswrapper[15202]: E0319 09:25:06.707481 15202 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered
Mar 19 09:25:06.707691 master-0 kubenswrapper[15202]: E0319 09:25:06.707629 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:06.707691 master-0 kubenswrapper[15202]: I0319 09:25:06.707521 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:06.707748 master-0 kubenswrapper[15202]: E0319 09:25:06.707656 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:10.707625404 +0000 UTC m=+28.093040220 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered
Mar 19 09:25:06.707871 master-0 kubenswrapper[15202]: E0319 09:25:06.707832 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:10.707806758 +0000 UTC m=+28.093221744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:06.707943 master-0 kubenswrapper[15202]: I0319 09:25:06.707916 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:06.708066 master-0 kubenswrapper[15202]: E0319 09:25:06.708037 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:06.708113 master-0 kubenswrapper[15202]: E0319 09:25:06.708085 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:10.708077655 +0000 UTC m=+28.093492471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:07.133903 master-0 kubenswrapper[15202]: I0319 09:25:07.133812 15202 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" start-of-body=
Mar 19 09:25:07.133903 master-0 kubenswrapper[15202]: I0319 09:25:07.133901 15202 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused"
Mar 19 09:25:07.247642 master-0 kubenswrapper[15202]: I0319 09:25:07.221089 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-check-endpoints/0.log"
Mar 19 09:25:07.247642 master-0 kubenswrapper[15202]: I0319 09:25:07.222684 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="21a9ca68aca58418f611d967784b8b2e15b3acfa4bde8394a7537d1e53b9f6af" exitCode=255
Mar 19 09:25:07.683590 master-0 kubenswrapper[15202]: I0319 09:25:07.681268 15202 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 19 09:25:07.683590 master-0 kubenswrapper[15202]: I0319 09:25:07.681355 15202 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 19 09:25:07.696884 master-0 kubenswrapper[15202]: E0319 09:25:07.690925 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-root-ca.crt: object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered
Mar 19 09:25:07.696884 master-0 kubenswrapper[15202]: E0319 09:25:07.690959 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/openshift-service-ca.crt: object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered
Mar 19 09:25:07.696884 master-0 kubenswrapper[15202]: E0319 09:25:07.690975 15202 projected.go:194] Error preparing data for projected volume kube-api-access-w6qs5 for pod openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp: [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered]
Mar 19 09:25:07.696884 master-0 kubenswrapper[15202]: E0319 09:25:07.691045 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5 podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:08.191025244 +0000 UTC m=+25.576440060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-w6qs5" (UniqueName: "kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered]
Mar 19 09:25:07.696884 master-0 kubenswrapper[15202]: I0319 09:25:07.692166 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhwd7\" (UniqueName: \"kubernetes.io/projected/141cb120-92da-4d8d-bc29-fc4c433a6336-kube-api-access-fhwd7\") pod \"cluster-samples-operator-85f7577d78-mfxr5\" (UID: \"141cb120-92da-4d8d-bc29-fc4c433a6336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5"
Mar 19 09:25:07.696884 master-0 kubenswrapper[15202]: I0319 09:25:07.696130 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n49x9\" (UniqueName: \"kubernetes.io/projected/89b0e82c-1cd1-45aa-9cab-2d11320a1ff7-kube-api-access-n49x9\") pod \"community-operators-wqngb\" (UID: \"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7\") " pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:07.697627 master-0 kubenswrapper[15202]: I0319 09:25:07.697579 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn48v\" (UniqueName: \"kubernetes.io/projected/86c4b0e4-3481-465d-b00f-022d2c58c183-kube-api-access-qn48v\") pod \"openshift-apiserver-operator-d65958b8-96qpx\" (UID: \"86c4b0e4-3481-465d-b00f-022d2c58c183\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-96qpx"
Mar 19 09:25:07.699020 master-0 kubenswrapper[15202]: I0319 09:25:07.698987 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47czp\" (UniqueName: \"kubernetes.io/projected/1f2148fe-f9f6-47da-894c-b88dae360ebe-kube-api-access-47czp\") pod \"package-server-manager-7b95f86987-gltb5\" (UID: \"1f2148fe-f9f6-47da-894c-b88dae360ebe\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5"
Mar 19 09:25:07.704272 master-0 kubenswrapper[15202]: I0319 09:25:07.704237 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxpw\" (UniqueName: \"kubernetes.io/projected/e9ebcecb-c210-434e-83a1-825265e206f1-kube-api-access-txxpw\") pod \"multus-additional-cni-plugins-tjzdb\" (UID: \"e9ebcecb-c210-434e-83a1-825265e206f1\") " pod="openshift-multus/multus-additional-cni-plugins-tjzdb"
Mar 19 09:25:07.704766 master-0 kubenswrapper[15202]: I0319 09:25:07.704740 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzdjs\" (UniqueName: \"kubernetes.io/projected/d486ce23-acf7-429a-9739-4770e1a2bf78-kube-api-access-bzdjs\") pod \"control-plane-machine-set-operator-6f97756bc8-l8kmn\" (UID: \"d486ce23-acf7-429a-9739-4770e1a2bf78\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn"
Mar 19 09:25:07.709613 master-0 kubenswrapper[15202]: E0319 09:25:07.709583 15202 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 09:25:07.709733 master-0 kubenswrapper[15202]: E0319 09:25:07.709720 15202 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 09:25:07.709868 master-0 kubenswrapper[15202]: E0319 09:25:07.709852 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access podName:89be0036-a2c8-48b4-9eaf-17fab972c4f4 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:08.209829206 +0000 UTC m=+25.595244222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access") pod "installer-2-master-0" (UID: "89be0036-a2c8-48b4-9eaf-17fab972c4f4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 19 09:25:07.710078 master-0 kubenswrapper[15202]: I0319 09:25:07.710040 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8wj\" (UniqueName: \"kubernetes.io/projected/5b36f3b2-caf9-40ad-a3a1-e83796142f54-kube-api-access-7k8wj\") pod \"service-ca-operator-b865698dc-wwkqz\" (UID: \"5b36f3b2-caf9-40ad-a3a1-e83796142f54\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-wwkqz"
Mar 19 09:25:07.710316 master-0 kubenswrapper[15202]: I0319 09:25:07.710286 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") pod \"route-controller-manager-6ff75bdd67-drxcb\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") " pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:25:07.710628 master-0 kubenswrapper[15202]: I0319 09:25:07.710577 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmwd\" (UniqueName: \"kubernetes.io/projected/9ac42112-6a00-4c17-b230-75b565aa668f-kube-api-access-bgmwd\") pod \"cluster-node-tuning-operator-598fbc5f8f-wh9q6\" (UID: \"9ac42112-6a00-4c17-b230-75b565aa668f\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-wh9q6"
Mar 19 09:25:07.710703 master-0 kubenswrapper[15202]: I0319 09:25:07.710604 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a417fe25-4aca-471c-941d-c195b6141042-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-dcmsc\" (UID: \"a417fe25-4aca-471c-941d-c195b6141042\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-dcmsc"
Mar 19 09:25:07.711046 master-0 kubenswrapper[15202]: I0319 09:25:07.711026 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zz2n\" (UniqueName: \"kubernetes.io/projected/9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc-kube-api-access-2zz2n\") pod \"migrator-8487694857-nkvjk\" (UID: \"9d3a3480-9f1f-4dd1-b58d-9721e4f18fbc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-nkvjk"
Mar 19 09:25:07.712626 master-0 kubenswrapper[15202]: I0319 09:25:07.712590 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4t8\" (UniqueName: \"kubernetes.io/projected/7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8-kube-api-access-qh4t8\") pod \"openshift-config-operator-95bf4f4d-bqqqq\" (UID: \"7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq"
Mar 19 09:25:07.713484 master-0 kubenswrapper[15202]: I0319 09:25:07.713362 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") pod \"controller-manager-6f9655dc5d-8lp25\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") " pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:25:07.713892 master-0 kubenswrapper[15202]: I0319 09:25:07.713843 15202 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 09:25:07.715627 master-0 kubenswrapper[15202]: I0319 09:25:07.715606 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2hg\" (UniqueName: \"kubernetes.io/projected/5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5-kube-api-access-4n2hg\") pod \"ovnkube-control-plane-57f769d897-r75tv\" (UID: \"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv"
Mar 19 09:25:07.716860 master-0 kubenswrapper[15202]: I0319 09:25:07.716785 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4wht\" (UniqueName: \"kubernetes.io/projected/dea35f60-33be-4ccc-b985-952eac3a85c0-kube-api-access-h4wht\") pod \"machine-approver-5c6485487f-cscz5\" (UID: \"dea35f60-33be-4ccc-b985-952eac3a85c0\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5"
Mar 19 09:25:07.722944 master-0 kubenswrapper[15202]: I0319 09:25:07.722899 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access\") pod \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\" (UID: \"89be0036-a2c8-48b4-9eaf-17fab972c4f4\") "
Mar 19 09:25:07.726350 master-0 kubenswrapper[15202]: I0319 09:25:07.726308 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "89be0036-a2c8-48b4-9eaf-17fab972c4f4" (UID: "89be0036-a2c8-48b4-9eaf-17fab972c4f4"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:25:07.824421 master-0 kubenswrapper[15202]: I0319 09:25:07.824375 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89be0036-a2c8-48b4-9eaf-17fab972c4f4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:25:07.932087 master-0 kubenswrapper[15202]: I0319 09:25:07.932055 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" Mar 19 09:25:07.950234 master-0 kubenswrapper[15202]: W0319 09:25:07.950200 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea35f60_33be_4ccc_b985_952eac3a85c0.slice/crio-c909ca6bf764e66006c10e8674c7f57b7f454d0130f3033d2a6640de9f9c5ee5 WatchSource:0}: Error finding container c909ca6bf764e66006c10e8674c7f57b7f454d0130f3033d2a6640de9f9c5ee5: Status 404 returned error can't find the container with id c909ca6bf764e66006c10e8674c7f57b7f454d0130f3033d2a6640de9f9c5ee5 Mar 19 09:25:08.234254 master-0 kubenswrapper[15202]: I0319 09:25:08.234182 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:08.234868 master-0 kubenswrapper[15202]: E0319 09:25:08.234511 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-root-ca.crt: object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered Mar 19 09:25:08.234868 master-0 kubenswrapper[15202]: E0319 09:25:08.234537 15202 
projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/openshift-service-ca.crt: object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered Mar 19 09:25:08.234868 master-0 kubenswrapper[15202]: E0319 09:25:08.234552 15202 projected.go:194] Error preparing data for projected volume kube-api-access-w6qs5 for pod openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp: [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered] Mar 19 09:25:08.234868 master-0 kubenswrapper[15202]: E0319 09:25:08.234632 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5 podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:09.234608892 +0000 UTC m=+26.620023718 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w6qs5" (UniqueName: "kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered] Mar 19 09:25:08.241853 master-0 kubenswrapper[15202]: I0319 09:25:08.241809 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/1.log" Mar 19 09:25:08.242038 master-0 kubenswrapper[15202]: I0319 09:25:08.241860 15202 generic.go:334] "Generic (PLEG): container finished" podID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" containerID="98392883f5d13272d9f78e0701e65eccdb98b8b34059cfebbb6a7f273b5e159f" exitCode=1 Mar 19 09:25:08.243576 master-0 kubenswrapper[15202]: I0319 09:25:08.243528 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/1.log" Mar 19 09:25:08.244005 master-0 kubenswrapper[15202]: I0319 09:25:08.243960 15202 generic.go:334] "Generic (PLEG): container finished" podID="d32541c9-eef6-417c-9f5a-a7392dc70aa0" containerID="e883f3efaa74902ad874d396550b0cb01d1872a885c29b583da8ebef350866c4" exitCode=255 Mar 19 09:25:08.246452 master-0 kubenswrapper[15202]: I0319 09:25:08.246425 15202 generic.go:334] "Generic (PLEG): container finished" podID="0cb70a30-a8d1-4037-81e6-eb4f0510a234" containerID="9eb1e9a14ebcec2e1a764be170f89e1ef614b1040813e25d65d1e6d7e567633c" exitCode=0 Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: E0319 09:25:08.283337 15202 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too 
long" expected="1s" actual="5.453s" Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283414 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283444 15202 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="7f0ee125-e760-4bd1-a88b-8e71716de6b8" Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283500 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"89be0036-a2c8-48b4-9eaf-17fab972c4f4","Type":"ContainerDied","Data":"58d5b64552b14fa37f1c4ade1890dfcbcf78def52cdf495457e904377a1b0a43"} Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283540 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58d5b64552b14fa37f1c4ade1890dfcbcf78def52cdf495457e904377a1b0a43" Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283666 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283685 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:25:08.284971 master-0 kubenswrapper[15202]: I0319 09:25:08.283719 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:25:08.309001 master-0 kubenswrapper[15202]: I0319 09:25:08.308875 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 09:25:09.250612 master-0 kubenswrapper[15202]: I0319 09:25:09.249809 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:09.250612 master-0 kubenswrapper[15202]: E0319 09:25:09.250029 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-root-ca.crt: object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered Mar 19 09:25:09.250612 master-0 kubenswrapper[15202]: E0319 09:25:09.250066 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/openshift-service-ca.crt: object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered Mar 19 09:25:09.250612 master-0 kubenswrapper[15202]: E0319 09:25:09.250080 15202 projected.go:194] Error preparing data for projected volume kube-api-access-w6qs5 for pod openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp: [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered] Mar 19 09:25:09.250612 master-0 kubenswrapper[15202]: E0319 09:25:09.250160 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5 podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:11.250141059 +0000 UTC m=+28.635555865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w6qs5" (UniqueName: "kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered] Mar 19 09:25:09.812553 master-0 kubenswrapper[15202]: E0319 09:25:09.812377 15202 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.529s" Mar 19 09:25:09.812553 master-0 kubenswrapper[15202]: I0319 09:25:09.812483 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:09.812553 master-0 kubenswrapper[15202]: I0319 09:25:09.812505 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" event={"ID":"d32541c9-eef6-417c-9f5a-a7392dc70aa0","Type":"ContainerStarted","Data":"e883f3efaa74902ad874d396550b0cb01d1872a885c29b583da8ebef350866c4"} Mar 19 09:25:09.812553 master-0 kubenswrapper[15202]: I0319 09:25:09.812520 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerStarted","Data":"9eb1e9a14ebcec2e1a764be170f89e1ef614b1040813e25d65d1e6d7e567633c"} Mar 19 09:25:09.813436 master-0 kubenswrapper[15202]: I0319 09:25:09.813259 15202 scope.go:117] "RemoveContainer" containerID="e883f3efaa74902ad874d396550b0cb01d1872a885c29b583da8ebef350866c4" Mar 19 09:25:09.814030 master-0 kubenswrapper[15202]: I0319 09:25:09.813983 15202 scope.go:117] "RemoveContainer" containerID="9eb1e9a14ebcec2e1a764be170f89e1ef614b1040813e25d65d1e6d7e567633c" Mar 19 09:25:09.819841 master-0 
kubenswrapper[15202]: I0319 09:25:09.819784 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389158 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-g96tv" event={"ID":"31742478-0d83-48cf-b38b-02416d95d4a8","Type":"ContainerStarted","Data":"32732bd9177b1c80b75ce4a0158c67ae0077812f118c25d4dd1f725e90e3f730"} Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389263 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389370 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389386 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" event={"ID":"f93b8728-4a33-4ee4-b7c6-cff7d7995953","Type":"ContainerStarted","Data":"82c9db947e048d06ab325d28a9da9998cc2e567351adaf60dac34444baffd037"} Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389405 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389419 15202 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="7f0ee125-e760-4bd1-a88b-8e71716de6b8" Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389435 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-j92kd" Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389459 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:25:10.389550 master-0 kubenswrapper[15202]: I0319 09:25:10.389551 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389568 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerDied","Data":"21a9ca68aca58418f611d967784b8b2e15b3acfa4bde8394a7537d1e53b9f6af"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389591 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hgc52" event={"ID":"467c2f01-2c23-41e2-acb9-08a84061fefc","Type":"ContainerStarted","Data":"3ccdb054a266b39ca01d22d9910dc4b3b467a19fddce63262de61466e95bf713"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389610 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hgc52" event={"ID":"467c2f01-2c23-41e2-acb9-08a84061fefc","Type":"ContainerStarted","Data":"893fe2d6f53b397cdb5f5f6354b1f6dab3b484f576f4afd434ac82b34cb4ef76"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389638 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389695 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:10.390179 master-0 
kubenswrapper[15202]: I0319 09:25:10.389709 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerStarted","Data":"fb65e57537c91c4e818bcaa291a7d70ac71b8cba3206885614909897a1a62d01"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389732 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-wjbt2" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389745 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerDied","Data":"98392883f5d13272d9f78e0701e65eccdb98b8b34059cfebbb6a7f273b5e159f"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389761 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" event={"ID":"d32541c9-eef6-417c-9f5a-a7392dc70aa0","Type":"ContainerDied","Data":"e883f3efaa74902ad874d396550b0cb01d1872a885c29b583da8ebef350866c4"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389806 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389820 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" event={"ID":"dea35f60-33be-4ccc-b985-952eac3a85c0","Type":"ContainerStarted","Data":"c909ca6bf764e66006c10e8674c7f57b7f454d0130f3033d2a6640de9f9c5ee5"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389835 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerDied","Data":"9eb1e9a14ebcec2e1a764be170f89e1ef614b1040813e25d65d1e6d7e567633c"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389869 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389883 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-s7ts2" event={"ID":"c2a16f6f-437c-4da5-a797-287e5e1ddbd4","Type":"ContainerStarted","Data":"d19e9b3bfe9954322bb4114d53b80ca66cfe5a40d8abfe3386ed4f1a98cee7b6"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389918 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" event={"ID":"dea35f60-33be-4ccc-b985-952eac3a85c0","Type":"ContainerStarted","Data":"3b137c5b2a6a1f401ae41e638270c23b2cf4e3abfb915b37d50e44387fad09f5"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389930 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" event={"ID":"141cb120-92da-4d8d-bc29-fc4c433a6336","Type":"ContainerStarted","Data":"c48f1ffb8a85bf2ac8f56d56c80ae7f7a50b3ceeb772007dd6221921110f148a"} Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.389973 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.390055 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-95w9b" Mar 19 09:25:10.390179 master-0 
kubenswrapper[15202]: I0319 09:25:10.390071 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.390109 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.390137 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.390167 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-p88qq" Mar 19 09:25:10.390179 master-0 kubenswrapper[15202]: I0319 09:25:10.390200 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390301 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p88qq" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390414 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390522 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390592 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390601 15202 scope.go:117] "RemoveContainer" 
containerID="21a9ca68aca58418f611d967784b8b2e15b3acfa4bde8394a7537d1e53b9f6af" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390631 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-65cccc5599-mhl2j" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390688 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390717 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390739 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390760 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390780 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390799 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390822 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:25:10.392919 master-0 kubenswrapper[15202]: I0319 09:25:10.390841 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:25:10.399150 master-0 kubenswrapper[15202]: I0319 09:25:10.399118 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:25:10.401434 master-0 kubenswrapper[15202]: I0319 09:25:10.401351 15202 scope.go:117] "RemoveContainer" containerID="98392883f5d13272d9f78e0701e65eccdb98b8b34059cfebbb6a7f273b5e159f" Mar 19 09:25:10.410724 master-0 kubenswrapper[15202]: I0319 09:25:10.410689 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-bqqqq" Mar 19 09:25:10.410773 master-0 kubenswrapper[15202]: I0319 09:25:10.410752 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-gltb5" Mar 19 09:25:10.414103 master-0 kubenswrapper[15202]: I0319 09:25:10.414059 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:25:10.422959 master-0 kubenswrapper[15202]: I0319 09:25:10.422865 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" Mar 19 09:25:10.495829 master-0 kubenswrapper[15202]: I0319 09:25:10.495746 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=37.495727972 podStartE2EDuration="37.495727972s" podCreationTimestamp="2026-03-19 09:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:10.471201857 +0000 UTC m=+27.856616673" watchObservedRunningTime="2026-03-19 09:25:10.495727972 +0000 UTC m=+27.881142778" Mar 19 09:25:10.776067 master-0 
kubenswrapper[15202]: I0319 09:25:10.776001 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:10.776223 master-0 kubenswrapper[15202]: I0319 09:25:10.776082 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:10.776223 master-0 kubenswrapper[15202]: I0319 09:25:10.776137 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" Mar 19 09:25:10.776390 master-0 kubenswrapper[15202]: E0319 09:25:10.776355 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered Mar 19 09:25:10.776448 master-0 kubenswrapper[15202]: E0319 09:25:10.776424 15202 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:18.77640785 +0000 UTC m=+36.161822666 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" not registered
Mar 19 09:25:10.776559 master-0 kubenswrapper[15202]: E0319 09:25:10.776528 15202 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered
Mar 19 09:25:10.776606 master-0 kubenswrapper[15202]: E0319 09:25:10.776564 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:18.776554183 +0000 UTC m=+36.161968999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" not registered
Mar 19 09:25:10.776606 master-0 kubenswrapper[15202]: E0319 09:25:10.776600 15202 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:10.776691 master-0 kubenswrapper[15202]: E0319 09:25:10.776627 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:18.776619155 +0000 UTC m=+36.162033971 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : object "openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" not registered
Mar 19 09:25:11.162461 master-0 kubenswrapper[15202]: I0319 09:25:11.162406 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-54cd8888b9-q4ztg"
Mar 19 09:25:11.163184 master-0 kubenswrapper[15202]: I0319 09:25:11.163134 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-6fccf84fc5-rnmt2"
Mar 19 09:25:11.276997 master-0 kubenswrapper[15202]: I0319 09:25:11.276930 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" event={"ID":"dea35f60-33be-4ccc-b985-952eac3a85c0","Type":"ContainerStarted","Data":"41fff1387ec3d61c33f2a14d88308784724d79567cff9004f6cc0ae8d5850e73"}
Mar 19 09:25:11.279678 master-0 kubenswrapper[15202]: I0319 09:25:11.279026 15202 generic.go:334] "Generic (PLEG): container finished" podID="dd69fc33-59d4-4538-b4ec-e2d08ac11f72" containerID="c0f8b1546d187df555bfe28571fad1a4d87b05e0591331f9a506ff6cf9b47942" exitCode=0
Mar 19 09:25:11.279678 master-0 kubenswrapper[15202]: I0319 09:25:11.279071 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx45" event={"ID":"dd69fc33-59d4-4538-b4ec-e2d08ac11f72","Type":"ContainerDied","Data":"c0f8b1546d187df555bfe28571fad1a4d87b05e0591331f9a506ff6cf9b47942"}
Mar 19 09:25:11.281272 master-0 kubenswrapper[15202]: I0319 09:25:11.280914 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mfxr5" event={"ID":"141cb120-92da-4d8d-bc29-fc4c433a6336","Type":"ContainerStarted","Data":"fc32dc8ec50c8b1b1cd827d027cd8fe016626f6f3518e43bda0fad00b812bc46"}
Mar 19 09:25:11.283848 master-0 kubenswrapper[15202]: I0319 09:25:11.283808 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:11.284047 master-0 kubenswrapper[15202]: E0319 09:25:11.283974 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-root-ca.crt: object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered
Mar 19 09:25:11.284047 master-0 kubenswrapper[15202]: E0319 09:25:11.283991 15202 projected.go:288] Couldn't get configMap openshift-cloud-controller-manager-operator/openshift-service-ca.crt: object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered
Mar 19 09:25:11.284047 master-0 kubenswrapper[15202]: E0319 09:25:11.284002 15202 projected.go:194] Error preparing data for projected volume kube-api-access-w6qs5 for pod openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp: [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered]
Mar 19 09:25:11.284167 master-0 kubenswrapper[15202]: E0319 09:25:11.284048 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5 podName:ce38ec35-8f00-4060-a620-1759a6bbef66 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:15.284033251 +0000 UTC m=+32.669448067 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-w6qs5" (UniqueName: "kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5") pod "cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66") : [object "openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" not registered, object "openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" not registered]
Mar 19 09:25:11.286635 master-0 kubenswrapper[15202]: I0319 09:25:11.286593 15202 generic.go:334] "Generic (PLEG): container finished" podID="39bf78ac-304b-4b82-8729-d184657ef3bb" containerID="a4448fa78dd0d37ca7c1e3ea3a9fe32b9196bbb2fe861662d94885bcb4824f30" exitCode=0
Mar 19 09:25:11.286711 master-0 kubenswrapper[15202]: I0319 09:25:11.286684 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzz6n" event={"ID":"39bf78ac-304b-4b82-8729-d184657ef3bb","Type":"ContainerDied","Data":"a4448fa78dd0d37ca7c1e3ea3a9fe32b9196bbb2fe861662d94885bcb4824f30"}
Mar 19 09:25:11.292569 master-0 kubenswrapper[15202]: I0319 09:25:11.292527 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/2.log"
Mar 19 09:25:11.294161 master-0 kubenswrapper[15202]: I0319 09:25:11.294131 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/1.log"
Mar 19 09:25:11.294604 master-0 kubenswrapper[15202]: I0319 09:25:11.294565 15202 generic.go:334] "Generic (PLEG): container finished" podID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" containerID="d01c20a752c69ad4fdbf88d6635a40cc54a638ede023acdd8e476bc823088f26" exitCode=1
Mar 19 09:25:11.294670 master-0 kubenswrapper[15202]: I0319 09:25:11.294649 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerDied","Data":"d01c20a752c69ad4fdbf88d6635a40cc54a638ede023acdd8e476bc823088f26"}
Mar 19 09:25:11.294705 master-0 kubenswrapper[15202]: I0319 09:25:11.294692 15202 scope.go:117] "RemoveContainer" containerID="98392883f5d13272d9f78e0701e65eccdb98b8b34059cfebbb6a7f273b5e159f"
Mar 19 09:25:11.295218 master-0 kubenswrapper[15202]: I0319 09:25:11.295173 15202 scope.go:117] "RemoveContainer" containerID="d01c20a752c69ad4fdbf88d6635a40cc54a638ede023acdd8e476bc823088f26"
Mar 19 09:25:11.295576 master-0 kubenswrapper[15202]: E0319 09:25:11.295546 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-nm9nx_openshift-machine-api(cd42096c-f18d-4bb5-8a51-8761dc1edb73)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" podUID="cd42096c-f18d-4bb5-8a51-8761dc1edb73"
Mar 19 09:25:11.298208 master-0 kubenswrapper[15202]: I0319 09:25:11.298176 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-check-endpoints/0.log"
Mar 19 09:25:11.300396 master-0 kubenswrapper[15202]: I0319 09:25:11.300353 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7a1511182fa3564db9f50c25912cc22f","Type":"ContainerStarted","Data":"3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15"}
Mar 19 09:25:11.300750 master-0 kubenswrapper[15202]: I0319 09:25:11.300697 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:25:11.305053 master-0 kubenswrapper[15202]: I0319 09:25:11.305027 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/2.log"
Mar 19 09:25:11.305445 master-0 kubenswrapper[15202]: I0319 09:25:11.305430 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/1.log"
Mar 19 09:25:11.305959 master-0 kubenswrapper[15202]: I0319 09:25:11.305939 15202 generic.go:334] "Generic (PLEG): container finished" podID="d32541c9-eef6-417c-9f5a-a7392dc70aa0" containerID="d04b5565f460914cbdf914be37a041423b677254e05486f3d50bf2995c7a798e" exitCode=255
Mar 19 09:25:11.306068 master-0 kubenswrapper[15202]: I0319 09:25:11.305965 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" event={"ID":"d32541c9-eef6-417c-9f5a-a7392dc70aa0","Type":"ContainerDied","Data":"d04b5565f460914cbdf914be37a041423b677254e05486f3d50bf2995c7a798e"}
Mar 19 09:25:11.306343 master-0 kubenswrapper[15202]: I0319 09:25:11.306307 15202 scope.go:117] "RemoveContainer" containerID="d04b5565f460914cbdf914be37a041423b677254e05486f3d50bf2995c7a798e"
Mar 19 09:25:11.306510 master-0 kubenswrapper[15202]: E0319 09:25:11.306484 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-866dc4744-hzrg4_openshift-machine-api(d32541c9-eef6-417c-9f5a-a7392dc70aa0)\"" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" podUID="d32541c9-eef6-417c-9f5a-a7392dc70aa0"
Mar 19 09:25:11.308650 master-0 kubenswrapper[15202]: I0319 09:25:11.308597 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpvpd" event={"ID":"f1943401-a75b-4e45-8c65-3cc36018d8c4","Type":"ContainerStarted","Data":"a1fb251e8a8b94550e839880c3132ac217feef64b05bf3724ac2fd1dae8ca398"}
Mar 19 09:25:11.313222 master-0 kubenswrapper[15202]: I0319 09:25:11.313186 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerStarted","Data":"177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36"}
Mar 19 09:25:11.366283 master-0 kubenswrapper[15202]: I0319 09:25:11.366219 15202 scope.go:117] "RemoveContainer" containerID="e883f3efaa74902ad874d396550b0cb01d1872a885c29b583da8ebef350866c4"
Mar 19 09:25:12.319981 master-0 kubenswrapper[15202]: I0319 09:25:12.319844 15202 generic.go:334] "Generic (PLEG): container finished" podID="0cb70a30-a8d1-4037-81e6-eb4f0510a234" containerID="177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36" exitCode=0
Mar 19 09:25:12.319981 master-0 kubenswrapper[15202]: I0319 09:25:12.319920 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerDied","Data":"177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36"}
Mar 19 09:25:12.319981 master-0 kubenswrapper[15202]: I0319 09:25:12.319956 15202 scope.go:117] "RemoveContainer" containerID="9eb1e9a14ebcec2e1a764be170f89e1ef614b1040813e25d65d1e6d7e567633c"
Mar 19 09:25:12.320516 master-0 kubenswrapper[15202]: I0319 09:25:12.320452 15202 scope.go:117] "RemoveContainer" containerID="177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36"
Mar 19 09:25:12.320688 master-0 kubenswrapper[15202]: E0319 09:25:12.320658 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-68bf6ff9d6-wshz8_openshift-insights(0cb70a30-a8d1-4037-81e6-eb4f0510a234)\"" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" podUID="0cb70a30-a8d1-4037-81e6-eb4f0510a234"
Mar 19 09:25:12.322876 master-0 kubenswrapper[15202]: I0319 09:25:12.322848 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tkx45" event={"ID":"dd69fc33-59d4-4538-b4ec-e2d08ac11f72","Type":"ContainerStarted","Data":"815484526bd51ef4ae03261dd4c8a3eb1ec0b3969103382528f597aff2e7f679"}
Mar 19 09:25:12.328971 master-0 kubenswrapper[15202]: I0319 09:25:12.328916 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzz6n" event={"ID":"39bf78ac-304b-4b82-8729-d184657ef3bb","Type":"ContainerStarted","Data":"dce7e22bba2765a541be6a99d2df4a773e5cb1ab87a5b1992b529f80101175d0"}
Mar 19 09:25:12.331620 master-0 kubenswrapper[15202]: I0319 09:25:12.331587 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/2.log"
Mar 19 09:25:12.334115 master-0 kubenswrapper[15202]: I0319 09:25:12.334084 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/2.log"
Mar 19 09:25:12.334840 master-0 kubenswrapper[15202]: I0319 09:25:12.334813 15202 scope.go:117] "RemoveContainer" containerID="d04b5565f460914cbdf914be37a041423b677254e05486f3d50bf2995c7a798e"
Mar 19 09:25:12.335021 master-0 kubenswrapper[15202]: E0319 09:25:12.334989 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-866dc4744-hzrg4_openshift-machine-api(d32541c9-eef6-417c-9f5a-a7392dc70aa0)\"" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" podUID="d32541c9-eef6-417c-9f5a-a7392dc70aa0"
Mar 19 09:25:12.336575 master-0 kubenswrapper[15202]: I0319 09:25:12.336540 15202 generic.go:334] "Generic (PLEG): container finished" podID="f1943401-a75b-4e45-8c65-3cc36018d8c4" containerID="a1fb251e8a8b94550e839880c3132ac217feef64b05bf3724ac2fd1dae8ca398" exitCode=0
Mar 19 09:25:12.336610 master-0 kubenswrapper[15202]: I0319 09:25:12.336598 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpvpd" event={"ID":"f1943401-a75b-4e45-8c65-3cc36018d8c4","Type":"ContainerDied","Data":"a1fb251e8a8b94550e839880c3132ac217feef64b05bf3724ac2fd1dae8ca398"}
Mar 19 09:25:12.338777 master-0 kubenswrapper[15202]: I0319 09:25:12.338745 15202 generic.go:334] "Generic (PLEG): container finished" podID="89b0e82c-1cd1-45aa-9cab-2d11320a1ff7" containerID="bf9931eb9b4d4998b4243f7c5ac64c447cf5145415978e4383c17e3a99f32a61" exitCode=0
Mar 19 09:25:12.339296 master-0 kubenswrapper[15202]: I0319 09:25:12.339265 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqngb" event={"ID":"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7","Type":"ContainerDied","Data":"bf9931eb9b4d4998b4243f7c5ac64c447cf5145415978e4383c17e3a99f32a61"}
Mar 19 09:25:12.759599 master-0 kubenswrapper[15202]: I0319 09:25:12.759516 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=3.759456212 podStartE2EDuration="3.759456212s" podCreationTimestamp="2026-03-19 09:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:12.758299285 +0000 UTC m=+30.143714101" watchObservedRunningTime="2026-03-19 09:25:12.759456212 +0000 UTC m=+30.144871018"
Mar 19 09:25:13.083194 master-0 kubenswrapper[15202]: I0319 09:25:13.083143 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:13.219178 master-0 kubenswrapper[15202]: I0319 09:25:13.219118 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") pod \"ce38ec35-8f00-4060-a620-1759a6bbef66\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") "
Mar 19 09:25:13.219347 master-0 kubenswrapper[15202]: I0319 09:25:13.219260 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") pod \"ce38ec35-8f00-4060-a620-1759a6bbef66\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") "
Mar 19 09:25:13.219347 master-0 kubenswrapper[15202]: I0319 09:25:13.219333 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") pod \"ce38ec35-8f00-4060-a620-1759a6bbef66\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") "
Mar 19 09:25:13.219426 master-0 kubenswrapper[15202]: I0319 09:25:13.219369 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") pod \"ce38ec35-8f00-4060-a620-1759a6bbef66\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") "
Mar 19 09:25:13.220280 master-0 kubenswrapper[15202]: I0319 09:25:13.220226 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "ce38ec35-8f00-4060-a620-1759a6bbef66" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:25:13.220344 master-0 kubenswrapper[15202]: I0319 09:25:13.220332 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") pod \"ce38ec35-8f00-4060-a620-1759a6bbef66\" (UID: \"ce38ec35-8f00-4060-a620-1759a6bbef66\") "
Mar 19 09:25:13.220479 master-0 kubenswrapper[15202]: I0319 09:25:13.220438 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images" (OuterVolumeSpecName: "images") pod "ce38ec35-8f00-4060-a620-1759a6bbef66" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:25:13.220668 master-0 kubenswrapper[15202]: I0319 09:25:13.220645 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "ce38ec35-8f00-4060-a620-1759a6bbef66" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:13.220939 master-0 kubenswrapper[15202]: I0319 09:25:13.220913 15202 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-auth-proxy-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:13.220939 master-0 kubenswrapper[15202]: I0319 09:25:13.220935 15202 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ce38ec35-8f00-4060-a620-1759a6bbef66-images\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:13.221014 master-0 kubenswrapper[15202]: I0319 09:25:13.220945 15202 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce38ec35-8f00-4060-a620-1759a6bbef66-host-etc-kube\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:13.226544 master-0 kubenswrapper[15202]: I0319 09:25:13.223053 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "ce38ec35-8f00-4060-a620-1759a6bbef66" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:25:13.237452 master-0 kubenswrapper[15202]: I0319 09:25:13.237060 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5" (OuterVolumeSpecName: "kube-api-access-w6qs5") pod "ce38ec35-8f00-4060-a620-1759a6bbef66" (UID: "ce38ec35-8f00-4060-a620-1759a6bbef66"). InnerVolumeSpecName "kube-api-access-w6qs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:25:13.324550 master-0 kubenswrapper[15202]: I0319 09:25:13.321643 15202 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce38ec35-8f00-4060-a620-1759a6bbef66-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:13.324550 master-0 kubenswrapper[15202]: I0319 09:25:13.321697 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6qs5\" (UniqueName: \"kubernetes.io/projected/ce38ec35-8f00-4060-a620-1759a6bbef66-kube-api-access-w6qs5\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:13.346554 master-0 kubenswrapper[15202]: I0319 09:25:13.345953 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqngb" event={"ID":"89b0e82c-1cd1-45aa-9cab-2d11320a1ff7","Type":"ContainerStarted","Data":"b9723aff563fbcf8b1ae825433209fe78b626d552e8326831816095a9ec805bc"}
Mar 19 09:25:13.348355 master-0 kubenswrapper[15202]: I0319 09:25:13.348306 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp" event={"ID":"ce38ec35-8f00-4060-a620-1759a6bbef66","Type":"ContainerDied","Data":"633fccc65fe5856fecc01dbcc7e58f5190eed4eb98e5e73385a0e9bcc0746e0e"}
Mar 19 09:25:13.348430 master-0 kubenswrapper[15202]: I0319 09:25:13.348404 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"
Mar 19 09:25:13.350076 master-0 kubenswrapper[15202]: I0319 09:25:13.350021 15202 scope.go:117] "RemoveContainer" containerID="177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36"
Mar 19 09:25:13.355828 master-0 kubenswrapper[15202]: E0319 09:25:13.355604 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-68bf6ff9d6-wshz8_openshift-insights(0cb70a30-a8d1-4037-81e6-eb4f0510a234)\"" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" podUID="0cb70a30-a8d1-4037-81e6-eb4f0510a234"
Mar 19 09:25:13.357084 master-0 kubenswrapper[15202]: I0319 09:25:13.356907 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpvpd" event={"ID":"f1943401-a75b-4e45-8c65-3cc36018d8c4","Type":"ContainerStarted","Data":"6aee32d7a82d4e59358079ba394a2535bf24ddd5316cfc0ec229b81960bf26ae"}
Mar 19 09:25:13.435239 master-0 kubenswrapper[15202]: I0319 09:25:13.435163 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"]
Mar 19 09:25:13.440553 master-0 kubenswrapper[15202]: I0319 09:25:13.439850 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-qrrhp"]
Mar 19 09:25:14.374214 master-0 kubenswrapper[15202]: I0319 09:25:14.374157 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:14.374821 master-0 kubenswrapper[15202]: I0319 09:25:14.374347 15202 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:25:14.392950 master-0 kubenswrapper[15202]: I0319 09:25:14.392906 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwjzr"
Mar 19 09:25:14.820314 master-0 kubenswrapper[15202]: I0319 09:25:14.820270 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce38ec35-8f00-4060-a620-1759a6bbef66" path="/var/lib/kubelet/pods/ce38ec35-8f00-4060-a620-1759a6bbef66/volumes"
Mar 19 09:25:16.124924 master-0 kubenswrapper[15202]: I0319 09:25:16.124623 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:16.124924 master-0 kubenswrapper[15202]: I0319 09:25:16.124725 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:16.124924 master-0 kubenswrapper[15202]: I0319 09:25:16.124860 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:16.124924 master-0 kubenswrapper[15202]: I0319 09:25:16.124888 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:16.131459 master-0 kubenswrapper[15202]: I0319 09:25:16.130064 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:25:16.131459 master-0 kubenswrapper[15202]: I0319 09:25:16.130120 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zpvpd"
Mar 19 09:25:16.164925 master-0 kubenswrapper[15202]: I0319 09:25:16.164882 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:16.172791 master-0 kubenswrapper[15202]: I0319 09:25:16.172747 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:17.173550 master-0 kubenswrapper[15202]: I0319 09:25:17.173481 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zpvpd" podUID="f1943401-a75b-4e45-8c65-3cc36018d8c4" containerName="registry-server" probeResult="failure" output=<
Mar 19 09:25:17.173550 master-0 kubenswrapper[15202]: timeout: failed to connect service ":50051" within 1s
Mar 19 09:25:17.173550 master-0 kubenswrapper[15202]: >
Mar 19 09:25:17.419621 master-0 kubenswrapper[15202]: I0319 09:25:17.419567 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tkx45"
Mar 19 09:25:17.419621 master-0 kubenswrapper[15202]: I0319 09:25:17.419627 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wzz6n"
Mar 19 09:25:17.926518 master-0 kubenswrapper[15202]: I0319 09:25:17.926447 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:17.926942 master-0 kubenswrapper[15202]: I0319 09:25:17.926861 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:17.969057 master-0 kubenswrapper[15202]: I0319 09:25:17.969014 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:18.419591 master-0 kubenswrapper[15202]: I0319 09:25:18.419531 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wqngb"
Mar 19 09:25:20.367693 master-0 kubenswrapper[15202]: I0319 09:25:20.367644 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt"]
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.367893 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.367905 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.367924 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.367931 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.367945 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89be0036-a2c8-48b4-9eaf-17fab972c4f4" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.367951 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="89be0036-a2c8-48b4-9eaf-17fab972c4f4" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.367960 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.367966 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.367977 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.367983 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.367995 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368001 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.368010 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc248e59-1519-4ac3-9005-2239214a8d62" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368018 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc248e59-1519-4ac3-9005-2239214a8d62" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: E0319 09:25:20.368026 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368032 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368123 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="259aa9cc-51a9-498e-b099-ba4d781801c5" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368142 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c252745a-f6dc-4e94-a4b2-fbf21c9602ee" containerName="prober"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368155 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df23b55-3dea-4f5e-9d53-5c7755ea4e48" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368166 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc248e59-1519-4ac3-9005-2239214a8d62" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368174 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de53594-9dcc-4318-806a-64f39ef76b3b" containerName="installer"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368183 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b9d3-27c2-4c42-ae8b-28e40570b3c2" containerName="assisted-installer-controller"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368192 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:25:20.368232 master-0 kubenswrapper[15202]: I0319 09:25:20.368201 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="89be0036-a2c8-48b4-9eaf-17fab972c4f4" containerName="installer"
Mar 19 09:25:20.369243 master-0 kubenswrapper[15202]: I0319 09:25:20.368963 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt"
Mar 19 09:25:20.378823 master-0 kubenswrapper[15202]: I0319 09:25:20.378774 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:25:20.380035 master-0 kubenswrapper[15202]: I0319 09:25:20.379882 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:25:20.380035 master-0 kubenswrapper[15202]: I0319 09:25:20.379882 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 09:25:20.383985 master-0 kubenswrapper[15202]: I0319 09:25:20.383898 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-zsf7l"
Mar 19 09:25:20.399171 master-0 kubenswrapper[15202]: I0319 09:25:20.399101 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 09:25:20.399427 master-0 kubenswrapper[15202]: I0319 09:25:20.399336 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:25:20.413682 master-0 kubenswrapper[15202]: I0319 09:25:20.413627 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4149b83-964c-4bd2-9769-44c7b9da0a52-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") "
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.413934 master-0 kubenswrapper[15202]: I0319 09:25:20.413693 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4149b83-964c-4bd2-9769-44c7b9da0a52-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.413934 master-0 kubenswrapper[15202]: I0319 09:25:20.413719 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v54z4\" (UniqueName: \"kubernetes.io/projected/a4149b83-964c-4bd2-9769-44c7b9da0a52-kube-api-access-v54z4\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.413934 master-0 kubenswrapper[15202]: I0319 09:25:20.413789 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4149b83-964c-4bd2-9769-44c7b9da0a52-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.413934 master-0 kubenswrapper[15202]: I0319 09:25:20.413816 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4149b83-964c-4bd2-9769-44c7b9da0a52-host-etc-kube\") pod 
\"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.514885 master-0 kubenswrapper[15202]: I0319 09:25:20.514816 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4149b83-964c-4bd2-9769-44c7b9da0a52-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.515171 master-0 kubenswrapper[15202]: I0319 09:25:20.515115 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4149b83-964c-4bd2-9769-44c7b9da0a52-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.515218 master-0 kubenswrapper[15202]: I0319 09:25:20.515199 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v54z4\" (UniqueName: \"kubernetes.io/projected/a4149b83-964c-4bd2-9769-44c7b9da0a52-kube-api-access-v54z4\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.515598 master-0 kubenswrapper[15202]: I0319 09:25:20.515559 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a4149b83-964c-4bd2-9769-44c7b9da0a52-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.515684 master-0 kubenswrapper[15202]: I0319 09:25:20.515670 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4149b83-964c-4bd2-9769-44c7b9da0a52-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.515896 master-0 kubenswrapper[15202]: I0319 09:25:20.515847 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4149b83-964c-4bd2-9769-44c7b9da0a52-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.515955 master-0 kubenswrapper[15202]: I0319 09:25:20.515907 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4149b83-964c-4bd2-9769-44c7b9da0a52-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.516286 master-0 kubenswrapper[15202]: I0319 09:25:20.516260 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a4149b83-964c-4bd2-9769-44c7b9da0a52-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.519089 master-0 kubenswrapper[15202]: I0319 09:25:20.519050 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4149b83-964c-4bd2-9769-44c7b9da0a52-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.542682 master-0 kubenswrapper[15202]: I0319 09:25:20.542646 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v54z4\" (UniqueName: \"kubernetes.io/projected/a4149b83-964c-4bd2-9769-44c7b9da0a52-kube-api-access-v54z4\") pod \"cluster-cloud-controller-manager-operator-7dff898856-rz5nt\" (UID: \"a4149b83-964c-4bd2-9769-44c7b9da0a52\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:20.685239 master-0 kubenswrapper[15202]: I0319 09:25:20.685180 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" Mar 19 09:25:21.422781 master-0 kubenswrapper[15202]: I0319 09:25:21.422724 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerStarted","Data":"735721424ad64f75cfc79f2a38adb31c107586c1ff18c9e0dc56b5f0173c3489"} Mar 19 09:25:21.423157 master-0 kubenswrapper[15202]: I0319 09:25:21.422786 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerStarted","Data":"203dd22278a9f40fcddde6dd79a4ed7c4144f2d1f0a477ecc5c55b58667906c6"} Mar 19 09:25:21.423157 master-0 kubenswrapper[15202]: I0319 09:25:21.422798 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerStarted","Data":"9fb086fbbaa67724fd0a765cd6a63768e91ae23e5f32d423173bcbccddda9e6e"} Mar 19 09:25:22.256726 master-0 kubenswrapper[15202]: I0319 09:25:22.256671 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:25:22.436723 master-0 kubenswrapper[15202]: I0319 09:25:22.436647 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerStarted","Data":"9d368544743541fa4e61adf2ed3e233913fd155e9f1b71f0d52f0ddd68bb6f5e"} Mar 19 09:25:22.464272 master-0 kubenswrapper[15202]: I0319 09:25:22.464176 15202 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" podStartSLOduration=2.464155505 podStartE2EDuration="2.464155505s" podCreationTimestamp="2026-03-19 09:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:22.462274441 +0000 UTC m=+39.847689257" watchObservedRunningTime="2026-03-19 09:25:22.464155505 +0000 UTC m=+39.849570331" Mar 19 09:25:25.812490 master-0 kubenswrapper[15202]: I0319 09:25:25.812411 15202 scope.go:117] "RemoveContainer" containerID="177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36" Mar 19 09:25:25.813087 master-0 kubenswrapper[15202]: I0319 09:25:25.812518 15202 scope.go:117] "RemoveContainer" containerID="d04b5565f460914cbdf914be37a041423b677254e05486f3d50bf2995c7a798e" Mar 19 09:25:26.174876 master-0 kubenswrapper[15202]: I0319 09:25:26.174790 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zpvpd" Mar 19 09:25:26.217501 master-0 kubenswrapper[15202]: I0319 09:25:26.217429 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zpvpd" Mar 19 09:25:26.811804 master-0 kubenswrapper[15202]: I0319 09:25:26.811742 15202 scope.go:117] "RemoveContainer" containerID="d01c20a752c69ad4fdbf88d6635a40cc54a638ede023acdd8e476bc823088f26" Mar 19 09:25:27.470364 master-0 kubenswrapper[15202]: I0319 09:25:27.470319 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/2.log" Mar 19 09:25:27.471254 master-0 kubenswrapper[15202]: I0319 09:25:27.471227 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-hzrg4" event={"ID":"d32541c9-eef6-417c-9f5a-a7392dc70aa0","Type":"ContainerStarted","Data":"c05c0071f8d082f8e06ab51ea7cfd19e2aa33f8bf5691db9e8b0996e2e4f77b7"} Mar 19 09:25:27.473283 master-0 kubenswrapper[15202]: I0319 09:25:27.473236 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerStarted","Data":"436264327abe3325ff4b8c101407c4a1d8b93ad5d90afa55d96f0c001990b3fe"} Mar 19 09:25:28.482351 master-0 kubenswrapper[15202]: I0319 09:25:28.482305 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/2.log" Mar 19 09:25:28.483121 master-0 kubenswrapper[15202]: I0319 09:25:28.483084 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerStarted","Data":"f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01"} Mar 19 09:25:31.500839 master-0 kubenswrapper[15202]: I0319 09:25:31.500792 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/1.log" Mar 19 09:25:31.501558 master-0 kubenswrapper[15202]: I0319 09:25:31.501537 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/0.log" Mar 19 09:25:31.501623 master-0 kubenswrapper[15202]: I0319 09:25:31.501579 15202 generic.go:334] "Generic (PLEG): container finished" podID="6a8e2194-aba6-4929-a29c-47c63c8ff799" containerID="46871c9c3ca81cca6462ffff9ccbad93a04486b47c22835c7cced6225bc557cc" exitCode=1 Mar 
19 09:25:31.501623 master-0 kubenswrapper[15202]: I0319 09:25:31.501613 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerDied","Data":"46871c9c3ca81cca6462ffff9ccbad93a04486b47c22835c7cced6225bc557cc"} Mar 19 09:25:31.501704 master-0 kubenswrapper[15202]: I0319 09:25:31.501656 15202 scope.go:117] "RemoveContainer" containerID="d43b2cecb46ee4d7282d2377662b9eb7bab83399567784e4db2c8496f2616648" Mar 19 09:25:31.502239 master-0 kubenswrapper[15202]: I0319 09:25:31.502215 15202 scope.go:117] "RemoveContainer" containerID="46871c9c3ca81cca6462ffff9ccbad93a04486b47c22835c7cced6225bc557cc" Mar 19 09:25:32.508851 master-0 kubenswrapper[15202]: I0319 09:25:32.508793 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/1.log" Mar 19 09:25:32.510017 master-0 kubenswrapper[15202]: I0319 09:25:32.509971 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-pgdrx" event={"ID":"6a8e2194-aba6-4929-a29c-47c63c8ff799","Type":"ContainerStarted","Data":"9b628603574a8331192a3882b7fb8616d3b762f3595436b6550ebd3eafdb3740"} Mar 19 09:25:33.870661 master-0 kubenswrapper[15202]: I0319 09:25:33.870595 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:25:33.871195 master-0 kubenswrapper[15202]: I0319 09:25:33.870915 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" containerName="startup-monitor" containerID="cri-o://528b303c1aa0e5650b031fceefeae9a2856d906d524b7139df21f2091e40d442" gracePeriod=5 Mar 19 09:25:39.570301 master-0 kubenswrapper[15202]: 
I0319 09:25:39.570251 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_4801b7b4c9bb4aca19f4e1af1002ed5d/startup-monitor/1.log" Mar 19 09:25:39.570817 master-0 kubenswrapper[15202]: I0319 09:25:39.570304 15202 generic.go:334] "Generic (PLEG): container finished" podID="4801b7b4c9bb4aca19f4e1af1002ed5d" containerID="528b303c1aa0e5650b031fceefeae9a2856d906d524b7139df21f2091e40d442" exitCode=137 Mar 19 09:25:42.788326 master-0 kubenswrapper[15202]: I0319 09:25:42.788256 15202 scope.go:117] "RemoveContainer" containerID="157ec68d28f9ad49e7460cf4325702e32a61a87e98a342a6b3f00e830966c9b0" Mar 19 09:25:42.812756 master-0 kubenswrapper[15202]: I0319 09:25:42.812713 15202 scope.go:117] "RemoveContainer" containerID="9b28c300e3439abe307f50e88ba8ce2d925b14966bafd61f93ba6a56066cd1f7" Mar 19 09:25:44.251669 master-0 kubenswrapper[15202]: I0319 09:25:44.251608 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-grltt"] Mar 19 09:25:44.252231 master-0 kubenswrapper[15202]: E0319 09:25:44.251899 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" containerName="startup-monitor" Mar 19 09:25:44.252231 master-0 kubenswrapper[15202]: I0319 09:25:44.251917 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" containerName="startup-monitor" Mar 19 09:25:44.252231 master-0 kubenswrapper[15202]: I0319 09:25:44.252057 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" containerName="startup-monitor" Mar 19 09:25:44.252572 master-0 kubenswrapper[15202]: I0319 09:25:44.252544 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.255214 master-0 kubenswrapper[15202]: I0319 09:25:44.255162 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 09:25:44.257270 master-0 kubenswrapper[15202]: I0319 09:25:44.257232 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 09:25:44.258225 master-0 kubenswrapper[15202]: I0319 09:25:44.258189 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 09:25:44.258301 master-0 kubenswrapper[15202]: I0319 09:25:44.258263 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 09:25:44.282005 master-0 kubenswrapper[15202]: I0319 09:25:44.281929 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 09:25:44.319145 master-0 kubenswrapper[15202]: I0319 09:25:44.319078 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269465d8-91d6-40d7-bfde-3eff9b93c1cf-trusted-ca\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.319145 master-0 kubenswrapper[15202]: I0319 09:25:44.319147 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269465d8-91d6-40d7-bfde-3eff9b93c1cf-config\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.319407 master-0 kubenswrapper[15202]: 
I0319 09:25:44.319187 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgxf4\" (UniqueName: \"kubernetes.io/projected/269465d8-91d6-40d7-bfde-3eff9b93c1cf-kube-api-access-jgxf4\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.319407 master-0 kubenswrapper[15202]: I0319 09:25:44.319220 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/269465d8-91d6-40d7-bfde-3eff9b93c1cf-serving-cert\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.319702 master-0 kubenswrapper[15202]: I0319 09:25:44.319641 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-grltt"] Mar 19 09:25:44.420720 master-0 kubenswrapper[15202]: I0319 09:25:44.420005 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/269465d8-91d6-40d7-bfde-3eff9b93c1cf-serving-cert\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.420939 master-0 kubenswrapper[15202]: I0319 09:25:44.420753 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269465d8-91d6-40d7-bfde-3eff9b93c1cf-trusted-ca\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.420939 master-0 kubenswrapper[15202]: I0319 09:25:44.420798 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269465d8-91d6-40d7-bfde-3eff9b93c1cf-config\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.421059 master-0 kubenswrapper[15202]: I0319 09:25:44.420998 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgxf4\" (UniqueName: \"kubernetes.io/projected/269465d8-91d6-40d7-bfde-3eff9b93c1cf-kube-api-access-jgxf4\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.421923 master-0 kubenswrapper[15202]: I0319 09:25:44.421884 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269465d8-91d6-40d7-bfde-3eff9b93c1cf-config\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.422061 master-0 kubenswrapper[15202]: I0319 09:25:44.421954 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269465d8-91d6-40d7-bfde-3eff9b93c1cf-trusted-ca\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.423041 master-0 kubenswrapper[15202]: I0319 09:25:44.423004 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/269465d8-91d6-40d7-bfde-3eff9b93c1cf-serving-cert\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " 
pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.523268 master-0 kubenswrapper[15202]: I0319 09:25:44.522930 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgxf4\" (UniqueName: \"kubernetes.io/projected/269465d8-91d6-40d7-bfde-3eff9b93c1cf-kube-api-access-jgxf4\") pod \"console-operator-76b6568d85-grltt\" (UID: \"269465d8-91d6-40d7-bfde-3eff9b93c1cf\") " pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.569668 master-0 kubenswrapper[15202]: I0319 09:25:44.569609 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:25:44.989147 master-0 kubenswrapper[15202]: I0319 09:25:44.989092 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-grltt"] Mar 19 09:25:44.995548 master-0 kubenswrapper[15202]: W0319 09:25:44.995492 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod269465d8_91d6_40d7_bfde_3eff9b93c1cf.slice/crio-1afb57b77a4f2983b067ea32c98a0ce35abc0dd3300ad9ed5721b7e5ba413e50 WatchSource:0}: Error finding container 1afb57b77a4f2983b067ea32c98a0ce35abc0dd3300ad9ed5721b7e5ba413e50: Status 404 returned error can't find the container with id 1afb57b77a4f2983b067ea32c98a0ce35abc0dd3300ad9ed5721b7e5ba413e50 Mar 19 09:25:45.604931 master-0 kubenswrapper[15202]: I0319 09:25:45.604859 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-grltt" event={"ID":"269465d8-91d6-40d7-bfde-3eff9b93c1cf","Type":"ContainerStarted","Data":"1afb57b77a4f2983b067ea32c98a0ce35abc0dd3300ad9ed5721b7e5ba413e50"} Mar 19 09:25:47.615895 master-0 kubenswrapper[15202]: I0319 09:25:47.615780 15202 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/0.log"
Mar 19 09:25:47.615895 master-0 kubenswrapper[15202]: I0319 09:25:47.615828 15202 generic.go:334] "Generic (PLEG): container finished" podID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" containerID="98fee1aa55842ede127933bf5c2c806ba678c1d15593fd4d96be97cc9b4306a3" exitCode=255
Mar 19 09:25:47.615895 master-0 kubenswrapper[15202]: I0319 09:25:47.615856 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-grltt" event={"ID":"269465d8-91d6-40d7-bfde-3eff9b93c1cf","Type":"ContainerDied","Data":"98fee1aa55842ede127933bf5c2c806ba678c1d15593fd4d96be97cc9b4306a3"}
Mar 19 09:25:47.616519 master-0 kubenswrapper[15202]: I0319 09:25:47.616316 15202 scope.go:117] "RemoveContainer" containerID="98fee1aa55842ede127933bf5c2c806ba678c1d15593fd4d96be97cc9b4306a3"
Mar 19 09:25:48.622117 master-0 kubenswrapper[15202]: I0319 09:25:48.622068 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/1.log"
Mar 19 09:25:48.623022 master-0 kubenswrapper[15202]: I0319 09:25:48.622975 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/0.log"
Mar 19 09:25:48.623022 master-0 kubenswrapper[15202]: I0319 09:25:48.623013 15202 generic.go:334] "Generic (PLEG): container finished" podID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" containerID="308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1" exitCode=255
Mar 19 09:25:48.623135 master-0 kubenswrapper[15202]: I0319 09:25:48.623046 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-grltt" event={"ID":"269465d8-91d6-40d7-bfde-3eff9b93c1cf","Type":"ContainerDied","Data":"308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1"}
Mar 19 09:25:48.623135 master-0 kubenswrapper[15202]: I0319 09:25:48.623092 15202 scope.go:117] "RemoveContainer" containerID="98fee1aa55842ede127933bf5c2c806ba678c1d15593fd4d96be97cc9b4306a3"
Mar 19 09:25:48.624156 master-0 kubenswrapper[15202]: I0319 09:25:48.623785 15202 scope.go:117] "RemoveContainer" containerID="308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1"
Mar 19 09:25:48.624156 master-0 kubenswrapper[15202]: E0319 09:25:48.624042 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:25:49.629598 master-0 kubenswrapper[15202]: I0319 09:25:49.629558 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/1.log"
Mar 19 09:25:49.630162 master-0 kubenswrapper[15202]: I0319 09:25:49.630136 15202 scope.go:117] "RemoveContainer" containerID="308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1"
Mar 19 09:25:49.630396 master-0 kubenswrapper[15202]: E0319 09:25:49.630357 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:25:50.541537 master-0 kubenswrapper[15202]: I0319 09:25:50.541494 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_4801b7b4c9bb4aca19f4e1af1002ed5d/startup-monitor/1.log"
Mar 19 09:25:50.541733 master-0 kubenswrapper[15202]: I0319 09:25:50.541581 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:25:50.624312 master-0 kubenswrapper[15202]: I0319 09:25:50.624239 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") pod \"4801b7b4c9bb4aca19f4e1af1002ed5d\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") "
Mar 19 09:25:50.624548 master-0 kubenswrapper[15202]: I0319 09:25:50.624397 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") pod \"4801b7b4c9bb4aca19f4e1af1002ed5d\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") "
Mar 19 09:25:50.624548 master-0 kubenswrapper[15202]: I0319 09:25:50.624443 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") pod \"4801b7b4c9bb4aca19f4e1af1002ed5d\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") "
Mar 19 09:25:50.624641 master-0 kubenswrapper[15202]: I0319 09:25:50.624588 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") pod \"4801b7b4c9bb4aca19f4e1af1002ed5d\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") "
Mar 19 09:25:50.624641 master-0 kubenswrapper[15202]: I0319 09:25:50.624602 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log" (OuterVolumeSpecName: "var-log") pod "4801b7b4c9bb4aca19f4e1af1002ed5d" (UID: "4801b7b4c9bb4aca19f4e1af1002ed5d"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:50.624738 master-0 kubenswrapper[15202]: I0319 09:25:50.624650 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") pod \"4801b7b4c9bb4aca19f4e1af1002ed5d\" (UID: \"4801b7b4c9bb4aca19f4e1af1002ed5d\") "
Mar 19 09:25:50.624738 master-0 kubenswrapper[15202]: I0319 09:25:50.624688 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests" (OuterVolumeSpecName: "manifests") pod "4801b7b4c9bb4aca19f4e1af1002ed5d" (UID: "4801b7b4c9bb4aca19f4e1af1002ed5d"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:50.624852 master-0 kubenswrapper[15202]: I0319 09:25:50.624778 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock" (OuterVolumeSpecName: "var-lock") pod "4801b7b4c9bb4aca19f4e1af1002ed5d" (UID: "4801b7b4c9bb4aca19f4e1af1002ed5d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:50.624903 master-0 kubenswrapper[15202]: I0319 09:25:50.624802 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "4801b7b4c9bb4aca19f4e1af1002ed5d" (UID: "4801b7b4c9bb4aca19f4e1af1002ed5d"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:50.625218 master-0 kubenswrapper[15202]: I0319 09:25:50.625174 15202 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:50.625218 master-0 kubenswrapper[15202]: I0319 09:25:50.625207 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:50.625324 master-0 kubenswrapper[15202]: I0319 09:25:50.625221 15202 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:50.625324 master-0 kubenswrapper[15202]: I0319 09:25:50.625240 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:50.630331 master-0 kubenswrapper[15202]: I0319 09:25:50.630273 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "4801b7b4c9bb4aca19f4e1af1002ed5d" (UID: "4801b7b4c9bb4aca19f4e1af1002ed5d"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:25:50.637331 master-0 kubenswrapper[15202]: I0319 09:25:50.637288 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_4801b7b4c9bb4aca19f4e1af1002ed5d/startup-monitor/1.log"
Mar 19 09:25:50.637523 master-0 kubenswrapper[15202]: I0319 09:25:50.637374 15202 scope.go:117] "RemoveContainer" containerID="528b303c1aa0e5650b031fceefeae9a2856d906d524b7139df21f2091e40d442"
Mar 19 09:25:50.637523 master-0 kubenswrapper[15202]: I0319 09:25:50.637425 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:25:50.726306 master-0 kubenswrapper[15202]: I0319 09:25:50.726158 15202 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/4801b7b4c9bb4aca19f4e1af1002ed5d-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:25:50.818660 master-0 kubenswrapper[15202]: I0319 09:25:50.818594 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4801b7b4c9bb4aca19f4e1af1002ed5d" path="/var/lib/kubelet/pods/4801b7b4c9bb4aca19f4e1af1002ed5d/volumes"
Mar 19 09:25:50.818892 master-0 kubenswrapper[15202]: I0319 09:25:50.818868 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 19 09:25:50.830801 master-0 kubenswrapper[15202]: I0319 09:25:50.830746 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:25:50.830801 master-0 kubenswrapper[15202]: I0319 09:25:50.830786 15202 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="ac0c60b8-c714-468e-b7c5-b644d6b0bbc3"
Mar 19 09:25:50.834001 master-0 kubenswrapper[15202]: I0319 09:25:50.833953 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:25:50.834001 master-0 kubenswrapper[15202]: I0319 09:25:50.833994 15202 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="ac0c60b8-c714-468e-b7c5-b644d6b0bbc3"
Mar 19 09:25:54.570184 master-0 kubenswrapper[15202]: I0319 09:25:54.570119 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-grltt"
Mar 19 09:25:54.570184 master-0 kubenswrapper[15202]: I0319 09:25:54.570179 15202 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b6568d85-grltt"
Mar 19 09:25:54.570819 master-0 kubenswrapper[15202]: I0319 09:25:54.570696 15202 scope.go:117] "RemoveContainer" containerID="308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1"
Mar 19 09:25:54.571012 master-0 kubenswrapper[15202]: E0319 09:25:54.570957 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:26:07.811691 master-0 kubenswrapper[15202]: I0319 09:26:07.811629 15202 scope.go:117] "RemoveContainer" containerID="308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1"
Mar 19 09:26:08.303681 master-0 kubenswrapper[15202]: I0319 09:26:08.303591 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/2.log"
Mar 19 09:26:08.304316 master-0 kubenswrapper[15202]: I0319 09:26:08.304282 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/1.log"
Mar 19 09:26:08.304448 master-0 kubenswrapper[15202]: I0319 09:26:08.304424 15202 generic.go:334] "Generic (PLEG): container finished" podID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" containerID="217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4" exitCode=255
Mar 19 09:26:08.304577 master-0 kubenswrapper[15202]: I0319 09:26:08.304520 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-grltt" event={"ID":"269465d8-91d6-40d7-bfde-3eff9b93c1cf","Type":"ContainerDied","Data":"217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4"}
Mar 19 09:26:08.304662 master-0 kubenswrapper[15202]: I0319 09:26:08.304651 15202 scope.go:117] "RemoveContainer" containerID="308ba1232c61ff53b825d9a60ec44723fe3d13d6412970a2ea3f63ba0cc652a1"
Mar 19 09:26:08.305208 master-0 kubenswrapper[15202]: I0319 09:26:08.305195 15202 scope.go:117] "RemoveContainer" containerID="217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4"
Mar 19 09:26:08.305456 master-0 kubenswrapper[15202]: E0319 09:26:08.305439 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:26:09.310824 master-0 kubenswrapper[15202]: I0319 09:26:09.310791 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/2.log"
Mar 19 09:26:14.570322 master-0 kubenswrapper[15202]: I0319 09:26:14.570243 15202 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b6568d85-grltt"
Mar 19 09:26:14.570841 master-0 kubenswrapper[15202]: I0319 09:26:14.570423 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-grltt"
Mar 19 09:26:14.570908 master-0 kubenswrapper[15202]: I0319 09:26:14.570879 15202 scope.go:117] "RemoveContainer" containerID="217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4"
Mar 19 09:26:14.571142 master-0 kubenswrapper[15202]: E0319 09:26:14.571110 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:26:15.344578 master-0 kubenswrapper[15202]: I0319 09:26:15.344489 15202 scope.go:117] "RemoveContainer" containerID="217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4"
Mar 19 09:26:15.344864 master-0 kubenswrapper[15202]: E0319 09:26:15.344699 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:26:25.397972 master-0 kubenswrapper[15202]: I0319 09:26:25.397901 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"]
Mar 19 09:26:25.399018 master-0 kubenswrapper[15202]: I0319 09:26:25.398165 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" podUID="01d017ee-b94e-402f-90c1-ccb3f336b2a8" containerName="controller-manager" containerID="cri-o://00f2488d3b13e4e27e3e63246f1f84387bf26e062f88b6e05117ccf0841ee905" gracePeriod=30
Mar 19 09:26:25.553737 master-0 kubenswrapper[15202]: I0319 09:26:25.553672 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"]
Mar 19 09:26:25.554001 master-0 kubenswrapper[15202]: I0319 09:26:25.553893 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" podUID="fedd4b33-c90e-42d5-bc29-73d1701bb671" containerName="route-controller-manager" containerID="cri-o://e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d" gracePeriod=30
Mar 19 09:26:26.209505 master-0 kubenswrapper[15202]: I0319 09:26:26.209404 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:26:26.383587 master-0 kubenswrapper[15202]: I0319 09:26:26.382896 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") pod \"fedd4b33-c90e-42d5-bc29-73d1701bb671\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") "
Mar 19 09:26:26.383587 master-0 kubenswrapper[15202]: I0319 09:26:26.382968 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") pod \"fedd4b33-c90e-42d5-bc29-73d1701bb671\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") "
Mar 19 09:26:26.383587 master-0 kubenswrapper[15202]: I0319 09:26:26.383009 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") pod \"fedd4b33-c90e-42d5-bc29-73d1701bb671\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") "
Mar 19 09:26:26.383587 master-0 kubenswrapper[15202]: I0319 09:26:26.383063 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") pod \"fedd4b33-c90e-42d5-bc29-73d1701bb671\" (UID: \"fedd4b33-c90e-42d5-bc29-73d1701bb671\") "
Mar 19 09:26:26.383990 master-0 kubenswrapper[15202]: I0319 09:26:26.383900 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config" (OuterVolumeSpecName: "config") pod "fedd4b33-c90e-42d5-bc29-73d1701bb671" (UID: "fedd4b33-c90e-42d5-bc29-73d1701bb671"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:26:26.394334 master-0 kubenswrapper[15202]: I0319 09:26:26.384042 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca" (OuterVolumeSpecName: "client-ca") pod "fedd4b33-c90e-42d5-bc29-73d1701bb671" (UID: "fedd4b33-c90e-42d5-bc29-73d1701bb671"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:26:26.406438 master-0 kubenswrapper[15202]: I0319 09:26:26.405009 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn" (OuterVolumeSpecName: "kube-api-access-2p6wn") pod "fedd4b33-c90e-42d5-bc29-73d1701bb671" (UID: "fedd4b33-c90e-42d5-bc29-73d1701bb671"). InnerVolumeSpecName "kube-api-access-2p6wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:26:26.406438 master-0 kubenswrapper[15202]: I0319 09:26:26.405295 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fedd4b33-c90e-42d5-bc29-73d1701bb671" (UID: "fedd4b33-c90e-42d5-bc29-73d1701bb671"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:26:26.421873 master-0 kubenswrapper[15202]: I0319 09:26:26.421680 15202 generic.go:334] "Generic (PLEG): container finished" podID="01d017ee-b94e-402f-90c1-ccb3f336b2a8" containerID="00f2488d3b13e4e27e3e63246f1f84387bf26e062f88b6e05117ccf0841ee905" exitCode=0
Mar 19 09:26:26.421873 master-0 kubenswrapper[15202]: I0319 09:26:26.421762 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" event={"ID":"01d017ee-b94e-402f-90c1-ccb3f336b2a8","Type":"ContainerDied","Data":"00f2488d3b13e4e27e3e63246f1f84387bf26e062f88b6e05117ccf0841ee905"}
Mar 19 09:26:26.423544 master-0 kubenswrapper[15202]: I0319 09:26:26.423354 15202 generic.go:334] "Generic (PLEG): container finished" podID="fedd4b33-c90e-42d5-bc29-73d1701bb671" containerID="e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d" exitCode=0
Mar 19 09:26:26.424754 master-0 kubenswrapper[15202]: I0319 09:26:26.424619 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"
Mar 19 09:26:26.424754 master-0 kubenswrapper[15202]: I0319 09:26:26.424647 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" event={"ID":"fedd4b33-c90e-42d5-bc29-73d1701bb671","Type":"ContainerDied","Data":"e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d"}
Mar 19 09:26:26.424754 master-0 kubenswrapper[15202]: I0319 09:26:26.424683 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb" event={"ID":"fedd4b33-c90e-42d5-bc29-73d1701bb671","Type":"ContainerDied","Data":"dd1819a433e70ea4c2b01b165e8a76f6644d7959ff5dbef7efb1f362b56038c1"}
Mar 19 09:26:26.424754 master-0 kubenswrapper[15202]: I0319 09:26:26.424713 15202 scope.go:117] "RemoveContainer" containerID="e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d"
Mar 19 09:26:26.439489 master-0 kubenswrapper[15202]: I0319 09:26:26.439321 15202 scope.go:117] "RemoveContainer" containerID="e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d"
Mar 19 09:26:26.439814 master-0 kubenswrapper[15202]: E0319 09:26:26.439769 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d\": container with ID starting with e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d not found: ID does not exist" containerID="e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d"
Mar 19 09:26:26.439928 master-0 kubenswrapper[15202]: I0319 09:26:26.439809 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d"} err="failed to get container status \"e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d\": rpc error: code = NotFound desc = could not find container \"e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d\": container with ID starting with e43a9e253f5e88c86c32c7a6dfbc3fc597d1400e7e3d7eb0af474dccb37fd22d not found: ID does not exist"
Mar 19 09:26:26.458521 master-0 kubenswrapper[15202]: I0319 09:26:26.456838 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"]
Mar 19 09:26:26.458521 master-0 kubenswrapper[15202]: I0319 09:26:26.458423 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff75bdd67-drxcb"]
Mar 19 09:26:26.474610 master-0 kubenswrapper[15202]: I0319 09:26:26.474572 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"
Mar 19 09:26:26.484295 master-0 kubenswrapper[15202]: I0319 09:26:26.484186 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.484417 master-0 kubenswrapper[15202]: I0319 09:26:26.484330 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fedd4b33-c90e-42d5-bc29-73d1701bb671-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.484417 master-0 kubenswrapper[15202]: I0319 09:26:26.484352 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p6wn\" (UniqueName: \"kubernetes.io/projected/fedd4b33-c90e-42d5-bc29-73d1701bb671-kube-api-access-2p6wn\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.484417 master-0 kubenswrapper[15202]: I0319 09:26:26.484369 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fedd4b33-c90e-42d5-bc29-73d1701bb671-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.585039 master-0 kubenswrapper[15202]: I0319 09:26:26.584930 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") pod \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") "
Mar 19 09:26:26.585039 master-0 kubenswrapper[15202]: I0319 09:26:26.585002 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") pod \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") "
Mar 19 09:26:26.585039 master-0 kubenswrapper[15202]: I0319 09:26:26.585036 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") pod \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") "
Mar 19 09:26:26.586245 master-0 kubenswrapper[15202]: I0319 09:26:26.585073 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") pod \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") "
Mar 19 09:26:26.586245 master-0 kubenswrapper[15202]: I0319 09:26:26.585175 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") pod \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\" (UID: \"01d017ee-b94e-402f-90c1-ccb3f336b2a8\") "
Mar 19 09:26:26.586245 master-0 kubenswrapper[15202]: I0319 09:26:26.586041 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "01d017ee-b94e-402f-90c1-ccb3f336b2a8" (UID: "01d017ee-b94e-402f-90c1-ccb3f336b2a8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:26:26.586977 master-0 kubenswrapper[15202]: I0319 09:26:26.586806 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config" (OuterVolumeSpecName: "config") pod "01d017ee-b94e-402f-90c1-ccb3f336b2a8" (UID: "01d017ee-b94e-402f-90c1-ccb3f336b2a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:26:26.587418 master-0 kubenswrapper[15202]: I0319 09:26:26.587347 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "01d017ee-b94e-402f-90c1-ccb3f336b2a8" (UID: "01d017ee-b94e-402f-90c1-ccb3f336b2a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:26:26.591241 master-0 kubenswrapper[15202]: I0319 09:26:26.590179 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01d017ee-b94e-402f-90c1-ccb3f336b2a8" (UID: "01d017ee-b94e-402f-90c1-ccb3f336b2a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:26:26.591241 master-0 kubenswrapper[15202]: I0319 09:26:26.590290 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8" (OuterVolumeSpecName: "kube-api-access-sqzn8") pod "01d017ee-b94e-402f-90c1-ccb3f336b2a8" (UID: "01d017ee-b94e-402f-90c1-ccb3f336b2a8"). InnerVolumeSpecName "kube-api-access-sqzn8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:26:26.687622 master-0 kubenswrapper[15202]: I0319 09:26:26.687510 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqzn8\" (UniqueName: \"kubernetes.io/projected/01d017ee-b94e-402f-90c1-ccb3f336b2a8-kube-api-access-sqzn8\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.687622 master-0 kubenswrapper[15202]: I0319 09:26:26.687599 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01d017ee-b94e-402f-90c1-ccb3f336b2a8-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.687622 master-0 kubenswrapper[15202]: I0319 09:26:26.687625 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.687622 master-0 kubenswrapper[15202]: I0319 09:26:26.687650 15202 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.688163 master-0 kubenswrapper[15202]: I0319 09:26:26.687674 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d017ee-b94e-402f-90c1-ccb3f336b2a8-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:26:26.820815 master-0 kubenswrapper[15202]: I0319 09:26:26.820717 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fedd4b33-c90e-42d5-bc29-73d1701bb671" path="/var/lib/kubelet/pods/fedd4b33-c90e-42d5-bc29-73d1701bb671/volumes"
Mar 19 09:26:27.184643 master-0 kubenswrapper[15202]: I0319 09:26:27.184566 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58fff6b545-fvbrw"]
Mar 19 09:26:27.184848 master-0 kubenswrapper[15202]: E0319 09:26:27.184827 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d017ee-b94e-402f-90c1-ccb3f336b2a8" containerName="controller-manager"
Mar 19 09:26:27.184848 master-0 kubenswrapper[15202]: I0319 09:26:27.184845 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d017ee-b94e-402f-90c1-ccb3f336b2a8" containerName="controller-manager"
Mar 19 09:26:27.184914 master-0 kubenswrapper[15202]: E0319 09:26:27.184861 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedd4b33-c90e-42d5-bc29-73d1701bb671" containerName="route-controller-manager"
Mar 19 09:26:27.184914 master-0 kubenswrapper[15202]: I0319 09:26:27.184869 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedd4b33-c90e-42d5-bc29-73d1701bb671" containerName="route-controller-manager"
Mar 19 09:26:27.185031 master-0 kubenswrapper[15202]: I0319 09:26:27.185000 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d017ee-b94e-402f-90c1-ccb3f336b2a8" containerName="controller-manager"
Mar 19 09:26:27.185074 master-0 kubenswrapper[15202]: I0319 09:26:27.185044 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedd4b33-c90e-42d5-bc29-73d1701bb671" containerName="route-controller-manager"
Mar 19 09:26:27.185509 master-0 kubenswrapper[15202]: I0319 09:26:27.185435 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw"
Mar 19 09:26:27.187619 master-0 kubenswrapper[15202]: I0319 09:26:27.187519 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"]
Mar 19 09:26:27.187619 master-0 kubenswrapper[15202]: W0319 09:26:27.187569 15202 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-57xnh": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-57xnh" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'master-0' and this object
Mar 19 09:26:27.187718 master-0 kubenswrapper[15202]: E0319 09:26:27.187668 15202 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-57xnh\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-57xnh\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Mar 19 09:26:27.187996 master-0 kubenswrapper[15202]: I0319 09:26:27.187963 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"
Mar 19 09:26:27.190375 master-0 kubenswrapper[15202]: W0319 09:26:27.190299 15202 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'master-0' and this object
Mar 19 09:26:27.190522 master-0 kubenswrapper[15202]: E0319 09:26:27.190416 15202 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Mar 19 09:26:27.190617 master-0 kubenswrapper[15202]: W0319 09:26:27.190574 15202 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'master-0' and this object
Mar 19 09:26:27.190679 master-0 kubenswrapper[15202]: E0319 09:26:27.190636 15202 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Mar 19 09:26:27.191212 master-0 kubenswrapper[15202]: I0319 09:26:27.191157 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:26:27.192258 master-0 kubenswrapper[15202]: W0319 09:26:27.192207 15202 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'master-0' and this object
Mar 19 09:26:27.192330 master-0 kubenswrapper[15202]: E0319 09:26:27.192273 15202 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError"
Mar 19 09:26:27.192549 master-0 kubenswrapper[15202]: I0319 09:26:27.192519 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-tgqwm"
Mar 19 09:26:27.194602 master-0 kubenswrapper[15202]: W0319 09:26:27.194571 15202 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'master-0' and this object
Mar 19 09:26:27.194683 master-0 kubenswrapper[15202]: E0319 09:26:27.194607 15202 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to
list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 19 09:26:27.209937 master-0 kubenswrapper[15202]: I0319 09:26:27.209885 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58fff6b545-fvbrw"] Mar 19 09:26:27.276445 master-0 kubenswrapper[15202]: I0319 09:26:27.272322 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"] Mar 19 09:26:27.297321 master-0 kubenswrapper[15202]: I0319 09:26:27.297236 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-proxy-ca-bundles\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.297321 master-0 kubenswrapper[15202]: I0319 09:26:27.297314 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-client-ca\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.297321 master-0 kubenswrapper[15202]: I0319 09:26:27.297342 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-config\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " 
pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.297824 master-0 kubenswrapper[15202]: I0319 09:26:27.297372 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcv9p\" (UniqueName: \"kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.297824 master-0 kubenswrapper[15202]: I0319 09:26:27.297409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-config\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.297824 master-0 kubenswrapper[15202]: I0319 09:26:27.297429 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58349d4-1322-4ebe-a513-146773f77a4b-serving-cert\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.297824 master-0 kubenswrapper[15202]: I0319 09:26:27.297445 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.297824 master-0 kubenswrapper[15202]: I0319 
09:26:27.297490 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxjgf\" (UniqueName: \"kubernetes.io/projected/e58349d4-1322-4ebe-a513-146773f77a4b-kube-api-access-bxjgf\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.297824 master-0 kubenswrapper[15202]: I0319 09:26:27.297535 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80dad6a1-700f-4953-88e2-edc17468af14-serving-cert\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.399143 master-0 kubenswrapper[15202]: I0319 09:26:27.399059 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-client-ca\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399260 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-config\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399299 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcv9p\" (UniqueName: 
\"kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399338 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-config\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399363 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58349d4-1322-4ebe-a513-146773f77a4b-serving-cert\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399380 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399407 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxjgf\" (UniqueName: \"kubernetes.io/projected/e58349d4-1322-4ebe-a513-146773f77a4b-kube-api-access-bxjgf\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " 
pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399425 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80dad6a1-700f-4953-88e2-edc17468af14-serving-cert\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:27.399533 master-0 kubenswrapper[15202]: I0319 09:26:27.399460 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-proxy-ca-bundles\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.400284 master-0 kubenswrapper[15202]: I0319 09:26:27.400242 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-client-ca\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.401329 master-0 kubenswrapper[15202]: I0319 09:26:27.401250 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-config\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.402142 master-0 kubenswrapper[15202]: I0319 09:26:27.402112 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-proxy-ca-bundles\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.403756 master-0 kubenswrapper[15202]: I0319 09:26:27.403671 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58349d4-1322-4ebe-a513-146773f77a4b-serving-cert\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.417651 master-0 kubenswrapper[15202]: I0319 09:26:27.415779 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxjgf\" (UniqueName: \"kubernetes.io/projected/e58349d4-1322-4ebe-a513-146773f77a4b-kube-api-access-bxjgf\") pod \"controller-manager-58fff6b545-fvbrw\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") " pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:27.434306 master-0 kubenswrapper[15202]: I0319 09:26:27.434229 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" event={"ID":"01d017ee-b94e-402f-90c1-ccb3f336b2a8","Type":"ContainerDied","Data":"5995c7b8ffe029a08c3e66897be233bbf8a8cb34f50eb229308640e61c764207"} Mar 19 09:26:27.434306 master-0 kubenswrapper[15202]: I0319 09:26:27.434272 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9655dc5d-8lp25" Mar 19 09:26:27.434577 master-0 kubenswrapper[15202]: I0319 09:26:27.434327 15202 scope.go:117] "RemoveContainer" containerID="00f2488d3b13e4e27e3e63246f1f84387bf26e062f88b6e05117ccf0841ee905" Mar 19 09:26:27.461430 master-0 kubenswrapper[15202]: I0319 09:26:27.460368 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"] Mar 19 09:26:27.465475 master-0 kubenswrapper[15202]: I0319 09:26:27.465397 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f9655dc5d-8lp25"] Mar 19 09:26:28.162639 master-0 kubenswrapper[15202]: I0319 09:26:28.162579 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-57xnh" Mar 19 09:26:28.166105 master-0 kubenswrapper[15202]: I0319 09:26:28.166023 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:28.191781 master-0 kubenswrapper[15202]: I0319 09:26:28.191706 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:26:28.204545 master-0 kubenswrapper[15202]: I0319 09:26:28.203992 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80dad6a1-700f-4953-88e2-edc17468af14-serving-cert\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:28.382869 master-0 kubenswrapper[15202]: I0319 09:26:28.382795 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:26:28.392532 master-0 kubenswrapper[15202]: I0319 09:26:28.392482 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-config\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:28.400857 master-0 kubenswrapper[15202]: E0319 09:26:28.400784 15202 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:28.401014 master-0 kubenswrapper[15202]: E0319 09:26:28.400941 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca podName:80dad6a1-700f-4953-88e2-edc17468af14 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:28.90090717 +0000 UTC m=+106.286321986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca") pod "route-controller-manager-7f758fb97d-qmbkd" (UID: "80dad6a1-700f-4953-88e2-edc17468af14") : failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:28.423648 master-0 kubenswrapper[15202]: E0319 09:26:28.423427 15202 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:28.423648 master-0 kubenswrapper[15202]: E0319 09:26:28.423544 15202 projected.go:194] Error preparing data for projected volume kube-api-access-bcv9p for pod openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd: failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:28.424288 master-0 kubenswrapper[15202]: E0319 09:26:28.423654 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p podName:80dad6a1-700f-4953-88e2-edc17468af14 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:28.923620057 +0000 UTC m=+106.309035083 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bcv9p" (UniqueName: "kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p") pod "route-controller-manager-7f758fb97d-qmbkd" (UID: "80dad6a1-700f-4953-88e2-edc17468af14") : failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:28.542159 master-0 kubenswrapper[15202]: I0319 09:26:28.542076 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:26:28.609599 master-0 kubenswrapper[15202]: I0319 09:26:28.609521 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:26:28.646734 master-0 kubenswrapper[15202]: I0319 09:26:28.646679 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58fff6b545-fvbrw"] Mar 19 09:26:28.654538 master-0 kubenswrapper[15202]: W0319 09:26:28.654435 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58349d4_1322_4ebe_a513_146773f77a4b.slice/crio-76323448a57e2c6d1dff9d86c363961b405e92b79f3ac8a4ce1eab048e4ba96a WatchSource:0}: Error finding container 76323448a57e2c6d1dff9d86c363961b405e92b79f3ac8a4ce1eab048e4ba96a: Status 404 returned error can't find the container with id 76323448a57e2c6d1dff9d86c363961b405e92b79f3ac8a4ce1eab048e4ba96a Mar 19 09:26:28.813589 master-0 kubenswrapper[15202]: I0319 09:26:28.813457 15202 scope.go:117] "RemoveContainer" containerID="217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4" Mar 19 09:26:28.820868 master-0 kubenswrapper[15202]: I0319 09:26:28.820785 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d017ee-b94e-402f-90c1-ccb3f336b2a8" path="/var/lib/kubelet/pods/01d017ee-b94e-402f-90c1-ccb3f336b2a8/volumes" Mar 19 09:26:28.924010 master-0 kubenswrapper[15202]: 
I0319 09:26:28.923901 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:28.925357 master-0 kubenswrapper[15202]: I0319 09:26:28.924731 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcv9p\" (UniqueName: \"kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:28.925357 master-0 kubenswrapper[15202]: I0319 09:26:28.925074 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:28.930379 master-0 kubenswrapper[15202]: I0319 09:26:28.930306 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcv9p\" (UniqueName: \"kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p\") pod \"route-controller-manager-7f758fb97d-qmbkd\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") " pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:29.080279 master-0 kubenswrapper[15202]: I0319 09:26:29.080214 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:29.450461 master-0 kubenswrapper[15202]: I0319 09:26:29.450413 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/3.log" Mar 19 09:26:29.451976 master-0 kubenswrapper[15202]: I0319 09:26:29.451913 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/2.log" Mar 19 09:26:29.452073 master-0 kubenswrapper[15202]: I0319 09:26:29.451979 15202 generic.go:334] "Generic (PLEG): container finished" podID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7" exitCode=255 Mar 19 09:26:29.452073 master-0 kubenswrapper[15202]: I0319 09:26:29.452050 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-grltt" event={"ID":"269465d8-91d6-40d7-bfde-3eff9b93c1cf","Type":"ContainerDied","Data":"c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7"} Mar 19 09:26:29.452372 master-0 kubenswrapper[15202]: I0319 09:26:29.452089 15202 scope.go:117] "RemoveContainer" containerID="217bafd938bb17b031026664e3ffefbaad31bbf37fda5c1e6d52037d3e9e13b4" Mar 19 09:26:29.452911 master-0 kubenswrapper[15202]: I0319 09:26:29.452815 15202 scope.go:117] "RemoveContainer" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7" Mar 19 09:26:29.453872 master-0 kubenswrapper[15202]: E0319 09:26:29.453127 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=console-operator 
pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" Mar 19 09:26:29.457603 master-0 kubenswrapper[15202]: I0319 09:26:29.457540 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" event={"ID":"e58349d4-1322-4ebe-a513-146773f77a4b","Type":"ContainerStarted","Data":"ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730"} Mar 19 09:26:29.457603 master-0 kubenswrapper[15202]: I0319 09:26:29.457601 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" event={"ID":"e58349d4-1322-4ebe-a513-146773f77a4b","Type":"ContainerStarted","Data":"76323448a57e2c6d1dff9d86c363961b405e92b79f3ac8a4ce1eab048e4ba96a"} Mar 19 09:26:29.539672 master-0 kubenswrapper[15202]: I0319 09:26:29.539499 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" podStartSLOduration=4.539463041 podStartE2EDuration="4.539463041s" podCreationTimestamp="2026-03-19 09:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:29.537088519 +0000 UTC m=+106.922503335" watchObservedRunningTime="2026-03-19 09:26:29.539463041 +0000 UTC m=+106.924877847" Mar 19 09:26:29.554127 master-0 kubenswrapper[15202]: I0319 09:26:29.554047 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"] Mar 19 09:26:29.561053 master-0 kubenswrapper[15202]: W0319 09:26:29.560993 15202 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80dad6a1_700f_4953_88e2_edc17468af14.slice/crio-339c92f1f2459dd07c13ff3b29aa31e6d5f62e666e189d4e9fff298b2a7e288f WatchSource:0}: Error finding container 339c92f1f2459dd07c13ff3b29aa31e6d5f62e666e189d4e9fff298b2a7e288f: Status 404 returned error can't find the container with id 339c92f1f2459dd07c13ff3b29aa31e6d5f62e666e189d4e9fff298b2a7e288f Mar 19 09:26:30.464502 master-0 kubenswrapper[15202]: I0319 09:26:30.464359 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" event={"ID":"80dad6a1-700f-4953-88e2-edc17468af14","Type":"ContainerStarted","Data":"8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407"} Mar 19 09:26:30.464502 master-0 kubenswrapper[15202]: I0319 09:26:30.464415 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" event={"ID":"80dad6a1-700f-4953-88e2-edc17468af14","Type":"ContainerStarted","Data":"339c92f1f2459dd07c13ff3b29aa31e6d5f62e666e189d4e9fff298b2a7e288f"} Mar 19 09:26:30.465591 master-0 kubenswrapper[15202]: I0319 09:26:30.464574 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" Mar 19 09:26:30.465831 master-0 kubenswrapper[15202]: I0319 09:26:30.465774 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/3.log" Mar 19 09:26:30.466079 master-0 kubenswrapper[15202]: I0319 09:26:30.466024 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" Mar 19 09:26:30.470741 master-0 kubenswrapper[15202]: I0319 09:26:30.470704 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw"
Mar 19 09:26:30.472516 master-0 kubenswrapper[15202]: I0319 09:26:30.472452 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"
Mar 19 09:26:30.487250 master-0 kubenswrapper[15202]: I0319 09:26:30.487169 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" podStartSLOduration=5.487149039 podStartE2EDuration="5.487149039s" podCreationTimestamp="2026-03-19 09:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:30.485128996 +0000 UTC m=+107.870543832" watchObservedRunningTime="2026-03-19 09:26:30.487149039 +0000 UTC m=+107.872563865"
Mar 19 09:26:34.570419 master-0 kubenswrapper[15202]: I0319 09:26:34.570345 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-grltt"
Mar 19 09:26:34.571330 master-0 kubenswrapper[15202]: I0319 09:26:34.570840 15202 scope.go:117] "RemoveContainer" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7"
Mar 19 09:26:34.571330 master-0 kubenswrapper[15202]: E0319 09:26:34.571016 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:26:34.571521 master-0 kubenswrapper[15202]: I0319 09:26:34.571389 15202 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b6568d85-grltt"
Mar 19 09:26:35.507447 master-0 kubenswrapper[15202]: I0319 09:26:35.507372 15202 scope.go:117] "RemoveContainer" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7"
Mar 19 09:26:35.507869 master-0 kubenswrapper[15202]: E0319 09:26:35.507629 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf"
Mar 19 09:26:37.787876 master-0 kubenswrapper[15202]: I0319 09:26:37.787805 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"]
Mar 19 09:26:37.788919 master-0 kubenswrapper[15202]: I0319 09:26:37.788896 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:37.791137 master-0 kubenswrapper[15202]: I0319 09:26:37.791097 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 09:26:37.839844 master-0 kubenswrapper[15202]: I0319 09:26:37.839767 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"]
Mar 19 09:26:37.962499 master-0 kubenswrapper[15202]: I0319 09:26:37.961459 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4d967639-8dbe-4d6d-bb35-6426fdeddefa-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:37.962499 master-0 kubenswrapper[15202]: I0319 09:26:37.961671 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d967639-8dbe-4d6d-bb35-6426fdeddefa-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:37.962499 master-0 kubenswrapper[15202]: I0319 09:26:37.961717 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nrnt\" (UniqueName: \"kubernetes.io/projected/4d967639-8dbe-4d6d-bb35-6426fdeddefa-kube-api-access-8nrnt\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.063427 master-0 kubenswrapper[15202]: I0319 09:26:38.063301 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d967639-8dbe-4d6d-bb35-6426fdeddefa-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.063427 master-0 kubenswrapper[15202]: I0319 09:26:38.063398 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nrnt\" (UniqueName: \"kubernetes.io/projected/4d967639-8dbe-4d6d-bb35-6426fdeddefa-kube-api-access-8nrnt\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.063427 master-0 kubenswrapper[15202]: I0319 09:26:38.063427 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4d967639-8dbe-4d6d-bb35-6426fdeddefa-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.064457 master-0 kubenswrapper[15202]: I0319 09:26:38.064426 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4d967639-8dbe-4d6d-bb35-6426fdeddefa-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.067150 master-0 kubenswrapper[15202]: I0319 09:26:38.067092 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d967639-8dbe-4d6d-bb35-6426fdeddefa-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.085040 master-0 kubenswrapper[15202]: I0319 09:26:38.084976 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nrnt\" (UniqueName: \"kubernetes.io/projected/4d967639-8dbe-4d6d-bb35-6426fdeddefa-kube-api-access-8nrnt\") pod \"machine-config-controller-b4f87c5b9-ljq8q\" (UID: \"4d967639-8dbe-4d6d-bb35-6426fdeddefa\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.102569 master-0 kubenswrapper[15202]: I0319 09:26:38.102517 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"
Mar 19 09:26:38.705412 master-0 kubenswrapper[15202]: I0319 09:26:38.705326 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q"]
Mar 19 09:26:38.710919 master-0 kubenswrapper[15202]: W0319 09:26:38.710834 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d967639_8dbe_4d6d_bb35_6426fdeddefa.slice/crio-2726bf80f19369da1fca7c62f153580f0db99d96433560eee242813e92f936c6 WatchSource:0}: Error finding container 2726bf80f19369da1fca7c62f153580f0db99d96433560eee242813e92f936c6: Status 404 returned error can't find the container with id 2726bf80f19369da1fca7c62f153580f0db99d96433560eee242813e92f936c6
Mar 19 09:26:39.110294 master-0 kubenswrapper[15202]: I0319 09:26:39.110251 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dcf5569b5-4cst9"]
Mar 19 09:26:39.111546 master-0 kubenswrapper[15202]: I0319 09:26:39.111527 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.116897 master-0 kubenswrapper[15202]: I0319 09:26:39.116860 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 09:26:39.116972 master-0 kubenswrapper[15202]: I0319 09:26:39.116894 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 09:26:39.117234 master-0 kubenswrapper[15202]: I0319 09:26:39.117220 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 09:26:39.117508 master-0 kubenswrapper[15202]: I0319 09:26:39.117454 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 09:26:39.117600 master-0 kubenswrapper[15202]: I0319 09:26:39.117576 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 09:26:39.117860 master-0 kubenswrapper[15202]: I0319 09:26:39.117831 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 09:26:39.128128 master-0 kubenswrapper[15202]: I0319 09:26:39.125032 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"]
Mar 19 09:26:39.128128 master-0 kubenswrapper[15202]: I0319 09:26:39.125701 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"]
Mar 19 09:26:39.128128 master-0 kubenswrapper[15202]: I0319 09:26:39.126165 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"
Mar 19 09:26:39.128128 master-0 kubenswrapper[15202]: I0319 09:26:39.126183 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"
Mar 19 09:26:39.131182 master-0 kubenswrapper[15202]: I0319 09:26:39.131130 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-dtscf"
Mar 19 09:26:39.131256 master-0 kubenswrapper[15202]: I0319 09:26:39.131176 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 19 09:26:39.155230 master-0 kubenswrapper[15202]: I0319 09:26:39.153173 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gmjrw"]
Mar 19 09:26:39.155230 master-0 kubenswrapper[15202]: I0319 09:26:39.154001 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"]
Mar 19 09:26:39.155230 master-0 kubenswrapper[15202]: I0319 09:26:39.154094 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.156918 master-0 kubenswrapper[15202]: I0319 09:26:39.156885 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 09:26:39.157170 master-0 kubenswrapper[15202]: I0319 09:26:39.157152 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 09:26:39.157300 master-0 kubenswrapper[15202]: I0319 09:26:39.157278 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 09:26:39.181767 master-0 kubenswrapper[15202]: I0319 09:26:39.181109 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"]
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194223 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzk5\" (UniqueName: \"kubernetes.io/projected/dff4eb24-47ac-46be-bf3d-d939bd739b52-kube-api-access-tpzk5\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194281 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-stats-auth\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194304 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc9rl\" (UniqueName: \"kubernetes.io/projected/5ce57500-da52-4d24-8fa6-868dae9a6932-kube-api-access-jc9rl\") pod \"ingress-canary-gmjrw\" (UID: \"5ce57500-da52-4d24-8fa6-868dae9a6932\") " pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194337 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-default-certificate\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194357 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ce57500-da52-4d24-8fa6-868dae9a6932-cert\") pod \"ingress-canary-gmjrw\" (UID: \"5ce57500-da52-4d24-8fa6-868dae9a6932\") " pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194374 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-metrics-certs\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194391 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkrd\" (UniqueName: \"kubernetes.io/projected/95757d65-298d-444f-95c6-ab809c291906-kube-api-access-9lkrd\") pod \"network-check-source-b4bf74f6-wqvfk\" (UID: \"95757d65-298d-444f-95c6-ab809c291906\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194410 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bb257b60-2a5e-4261-95da-2a641421ec4f-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-l2279\" (UID: \"bb257b60-2a5e-4261-95da-2a641421ec4f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"
Mar 19 09:26:39.194585 master-0 kubenswrapper[15202]: I0319 09:26:39.194432 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff4eb24-47ac-46be-bf3d-d939bd739b52-service-ca-bundle\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.226493 master-0 kubenswrapper[15202]: I0319 09:26:39.222484 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gmjrw"]
Mar 19 09:26:39.247496 master-0 kubenswrapper[15202]: I0319 09:26:39.247357 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6mbkc"]
Mar 19 09:26:39.252179 master-0 kubenswrapper[15202]: I0319 09:26:39.248302 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.252179 master-0 kubenswrapper[15202]: I0319 09:26:39.250729 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jvr7z"
Mar 19 09:26:39.254029 master-0 kubenswrapper[15202]: I0319 09:26:39.253094 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 19 09:26:39.295525 master-0 kubenswrapper[15202]: I0319 09:26:39.295459 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzk5\" (UniqueName: \"kubernetes.io/projected/dff4eb24-47ac-46be-bf3d-d939bd739b52-kube-api-access-tpzk5\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295536 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-stats-auth\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295567 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc9rl\" (UniqueName: \"kubernetes.io/projected/5ce57500-da52-4d24-8fa6-868dae9a6932-kube-api-access-jc9rl\") pod \"ingress-canary-gmjrw\" (UID: \"5ce57500-da52-4d24-8fa6-868dae9a6932\") " pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295614 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-default-certificate\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295646 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ce57500-da52-4d24-8fa6-868dae9a6932-cert\") pod \"ingress-canary-gmjrw\" (UID: \"5ce57500-da52-4d24-8fa6-868dae9a6932\") " pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295672 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-metrics-certs\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295696 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lkrd\" (UniqueName: \"kubernetes.io/projected/95757d65-298d-444f-95c6-ab809c291906-kube-api-access-9lkrd\") pod \"network-check-source-b4bf74f6-wqvfk\" (UID: \"95757d65-298d-444f-95c6-ab809c291906\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295719 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bb257b60-2a5e-4261-95da-2a641421ec4f-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-l2279\" (UID: \"bb257b60-2a5e-4261-95da-2a641421ec4f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"
Mar 19 09:26:39.295760 master-0 kubenswrapper[15202]: I0319 09:26:39.295745 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff4eb24-47ac-46be-bf3d-d939bd739b52-service-ca-bundle\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.297603 master-0 kubenswrapper[15202]: I0319 09:26:39.297576 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff4eb24-47ac-46be-bf3d-d939bd739b52-service-ca-bundle\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.300369 master-0 kubenswrapper[15202]: I0319 09:26:39.299316 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-stats-auth\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.300631 master-0 kubenswrapper[15202]: I0319 09:26:39.300500 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-metrics-certs\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.317880 master-0 kubenswrapper[15202]: I0319 09:26:39.301096 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dff4eb24-47ac-46be-bf3d-d939bd739b52-default-certificate\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.317880 master-0 kubenswrapper[15202]: I0319 09:26:39.301419 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ce57500-da52-4d24-8fa6-868dae9a6932-cert\") pod \"ingress-canary-gmjrw\" (UID: \"5ce57500-da52-4d24-8fa6-868dae9a6932\") " pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.317880 master-0 kubenswrapper[15202]: I0319 09:26:39.302010 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bb257b60-2a5e-4261-95da-2a641421ec4f-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-l2279\" (UID: \"bb257b60-2a5e-4261-95da-2a641421ec4f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"
Mar 19 09:26:39.319606 master-0 kubenswrapper[15202]: I0319 09:26:39.319194 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkrd\" (UniqueName: \"kubernetes.io/projected/95757d65-298d-444f-95c6-ab809c291906-kube-api-access-9lkrd\") pod \"network-check-source-b4bf74f6-wqvfk\" (UID: \"95757d65-298d-444f-95c6-ab809c291906\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"
Mar 19 09:26:39.322784 master-0 kubenswrapper[15202]: I0319 09:26:39.321812 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzk5\" (UniqueName: \"kubernetes.io/projected/dff4eb24-47ac-46be-bf3d-d939bd739b52-kube-api-access-tpzk5\") pod \"router-default-7dcf5569b5-4cst9\" (UID: \"dff4eb24-47ac-46be-bf3d-d939bd739b52\") " pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.327411 master-0 kubenswrapper[15202]: I0319 09:26:39.327361 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc9rl\" (UniqueName: \"kubernetes.io/projected/5ce57500-da52-4d24-8fa6-868dae9a6932-kube-api-access-jc9rl\") pod \"ingress-canary-gmjrw\" (UID: \"5ce57500-da52-4d24-8fa6-868dae9a6932\") " pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.398540 master-0 kubenswrapper[15202]: I0319 09:26:39.398453 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxprd\" (UniqueName: \"kubernetes.io/projected/9d86050e-430b-4bba-a7ed-e9e1378fae61-kube-api-access-sxprd\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.398743 master-0 kubenswrapper[15202]: I0319 09:26:39.398584 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d86050e-430b-4bba-a7ed-e9e1378fae61-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.398743 master-0 kubenswrapper[15202]: I0319 09:26:39.398615 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9d86050e-430b-4bba-a7ed-e9e1378fae61-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.398743 master-0 kubenswrapper[15202]: I0319 09:26:39.398667 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/9d86050e-430b-4bba-a7ed-e9e1378fae61-ready\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.436320 master-0 kubenswrapper[15202]: I0319 09:26:39.436258 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-4cst9"
Mar 19 09:26:39.457167 master-0 kubenswrapper[15202]: W0319 09:26:39.456688 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff4eb24_47ac_46be_bf3d_d939bd739b52.slice/crio-992051c48556a77ee375fb7cfa082005ddb97067466696fc6521acae3a2894a3 WatchSource:0}: Error finding container 992051c48556a77ee375fb7cfa082005ddb97067466696fc6521acae3a2894a3: Status 404 returned error can't find the container with id 992051c48556a77ee375fb7cfa082005ddb97067466696fc6521acae3a2894a3
Mar 19 09:26:39.457571 master-0 kubenswrapper[15202]: I0319 09:26:39.457540 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"
Mar 19 09:26:39.471890 master-0 kubenswrapper[15202]: I0319 09:26:39.470311 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"
Mar 19 09:26:39.494279 master-0 kubenswrapper[15202]: I0319 09:26:39.494188 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gmjrw"
Mar 19 09:26:39.500504 master-0 kubenswrapper[15202]: I0319 09:26:39.500448 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d86050e-430b-4bba-a7ed-e9e1378fae61-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.500608 master-0 kubenswrapper[15202]: I0319 09:26:39.500511 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9d86050e-430b-4bba-a7ed-e9e1378fae61-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.500608 master-0 kubenswrapper[15202]: I0319 09:26:39.500551 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/9d86050e-430b-4bba-a7ed-e9e1378fae61-ready\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.500608 master-0 kubenswrapper[15202]: I0319 09:26:39.500589 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxprd\" (UniqueName: \"kubernetes.io/projected/9d86050e-430b-4bba-a7ed-e9e1378fae61-kube-api-access-sxprd\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.500941 master-0 kubenswrapper[15202]: I0319 09:26:39.500922 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d86050e-430b-4bba-a7ed-e9e1378fae61-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.501728 master-0 kubenswrapper[15202]: I0319 09:26:39.501514 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9d86050e-430b-4bba-a7ed-e9e1378fae61-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.501847 master-0 kubenswrapper[15202]: I0319 09:26:39.501742 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/9d86050e-430b-4bba-a7ed-e9e1378fae61-ready\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.520107 master-0 kubenswrapper[15202]: I0319 09:26:39.518959 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxprd\" (UniqueName: \"kubernetes.io/projected/9d86050e-430b-4bba-a7ed-e9e1378fae61-kube-api-access-sxprd\") pod \"cni-sysctl-allowlist-ds-6mbkc\" (UID: \"9d86050e-430b-4bba-a7ed-e9e1378fae61\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.553407 master-0 kubenswrapper[15202]: I0319 09:26:39.553341 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q" event={"ID":"4d967639-8dbe-4d6d-bb35-6426fdeddefa","Type":"ContainerStarted","Data":"6f9581a76169352b7c29e5bb0b21a55332e0d66896a42776006f867931a04e95"}
Mar 19 09:26:39.553407 master-0 kubenswrapper[15202]: I0319 09:26:39.553394 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q" event={"ID":"4d967639-8dbe-4d6d-bb35-6426fdeddefa","Type":"ContainerStarted","Data":"1a74237b549ae9d4b8dd6c8544555a0732e48cbaa81bb9f8707806c36260c2dd"}
Mar 19 09:26:39.553407 master-0 kubenswrapper[15202]: I0319 09:26:39.553408 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q" event={"ID":"4d967639-8dbe-4d6d-bb35-6426fdeddefa","Type":"ContainerStarted","Data":"2726bf80f19369da1fca7c62f153580f0db99d96433560eee242813e92f936c6"}
Mar 19 09:26:39.554686 master-0 kubenswrapper[15202]: I0319 09:26:39.554649 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" event={"ID":"dff4eb24-47ac-46be-bf3d-d939bd739b52","Type":"ContainerStarted","Data":"992051c48556a77ee375fb7cfa082005ddb97067466696fc6521acae3a2894a3"}
Mar 19 09:26:39.573224 master-0 kubenswrapper[15202]: I0319 09:26:39.573159 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:39.927389 master-0 kubenswrapper[15202]: I0319 09:26:39.927270 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-ljq8q" podStartSLOduration=2.92724048 podStartE2EDuration="2.92724048s" podCreationTimestamp="2026-03-19 09:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:39.615973104 +0000 UTC m=+117.001387940" watchObservedRunningTime="2026-03-19 09:26:39.92724048 +0000 UTC m=+117.312655286"
Mar 19 09:26:39.947618 master-0 kubenswrapper[15202]: W0319 09:26:39.946756 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb257b60_2a5e_4261_95da_2a641421ec4f.slice/crio-91241bf460a0ce46297f578724c190c5a379e370dcd4b3e2f79b7a7efb939f6c WatchSource:0}: Error finding container 91241bf460a0ce46297f578724c190c5a379e370dcd4b3e2f79b7a7efb939f6c: Status 404 returned error can't find the container with id 91241bf460a0ce46297f578724c190c5a379e370dcd4b3e2f79b7a7efb939f6c
Mar 19 09:26:39.948311 master-0 kubenswrapper[15202]: I0319 09:26:39.948251 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279"]
Mar 19 09:26:40.097264 master-0 kubenswrapper[15202]: I0319 09:26:40.097219 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk"]
Mar 19 09:26:40.193879 master-0 kubenswrapper[15202]: I0319 09:26:40.193832 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gmjrw"]
Mar 19 09:26:40.207625 master-0 kubenswrapper[15202]: W0319 09:26:40.204995 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce57500_da52_4d24_8fa6_868dae9a6932.slice/crio-1f1525f9ded84bbb5e7e068881e1ee1b74a3c932e016de13705f41af80cbc113 WatchSource:0}: Error finding container 1f1525f9ded84bbb5e7e068881e1ee1b74a3c932e016de13705f41af80cbc113: Status 404 returned error can't find the container with id 1f1525f9ded84bbb5e7e068881e1ee1b74a3c932e016de13705f41af80cbc113
Mar 19 09:26:40.564494 master-0 kubenswrapper[15202]: I0319 09:26:40.563717 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc" event={"ID":"9d86050e-430b-4bba-a7ed-e9e1378fae61","Type":"ContainerStarted","Data":"1853b39fe92440fde3412164b8c73787fc42d93a3fe0df8c0c9c242ed18536ca"}
Mar 19 09:26:40.564494 master-0 kubenswrapper[15202]: I0319 09:26:40.563766 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc" event={"ID":"9d86050e-430b-4bba-a7ed-e9e1378fae61","Type":"ContainerStarted","Data":"a53b0e4b835c06a17e23d3277b6a8a639c445dd183974955c7848a72538cbe34"}
Mar 19 09:26:40.564795 master-0 kubenswrapper[15202]: I0319 09:26:40.564546 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc"
Mar 19 09:26:40.570481 master-0 kubenswrapper[15202]: I0319 09:26:40.565457 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279" event={"ID":"bb257b60-2a5e-4261-95da-2a641421ec4f","Type":"ContainerStarted","Data":"91241bf460a0ce46297f578724c190c5a379e370dcd4b3e2f79b7a7efb939f6c"}
Mar 19 09:26:40.570481 master-0 kubenswrapper[15202]: I0319 09:26:40.567144 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gmjrw" event={"ID":"5ce57500-da52-4d24-8fa6-868dae9a6932","Type":"ContainerStarted","Data":"31280de770f4339ba3264d09c6b8c6ecc839e3aa9baf6b2c0810a1d0e0733f7b"}
Mar 19 09:26:40.570481 master-0 kubenswrapper[15202]: I0319 09:26:40.567165 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gmjrw" event={"ID":"5ce57500-da52-4d24-8fa6-868dae9a6932","Type":"ContainerStarted","Data":"1f1525f9ded84bbb5e7e068881e1ee1b74a3c932e016de13705f41af80cbc113"}
Mar 19 09:26:40.570481 master-0 kubenswrapper[15202]: I0319 09:26:40.570251 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk" event={"ID":"95757d65-298d-444f-95c6-ab809c291906","Type":"ContainerStarted","Data":"5f08bb2aac83d32d2e385232ddd1bb1cf74432d37fb7ff8b563f30de66229809"}
Mar 19 09:26:40.570481 master-0 kubenswrapper[15202]: I0319 09:26:40.570276 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk" event={"ID":"95757d65-298d-444f-95c6-ab809c291906","Type":"ContainerStarted","Data":"4020b234eebc307665dbbfd90bf5a4b56716186903ca4bf7c646f90fc7279712"}
Mar 19 09:26:40.586492 master-0 kubenswrapper[15202]: I0319 09:26:40.585820 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc" podStartSLOduration=1.585803394 podStartE2EDuration="1.585803394s" podCreationTimestamp="2026-03-19 09:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:40.585179688 +0000 UTC m=+117.970594504" watchObservedRunningTime="2026-03-19 09:26:40.585803394 +0000 UTC m=+117.971218210"
Mar 19 09:26:40.659568 master-0 kubenswrapper[15202]: I0319 09:26:40.654595 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gmjrw" podStartSLOduration=1.654567901 podStartE2EDuration="1.654567901s" podCreationTimestamp="2026-03-19 09:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:40.645091447 +0000 UTC m=+118.030506263" watchObservedRunningTime="2026-03-19 09:26:40.654567901 +0000 UTC m=+118.039982717"
Mar 19 09:26:40.659568 master-0 kubenswrapper[15202]: I0319 09:26:40.654954 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-wqvfk" podStartSLOduration=504.654944361 podStartE2EDuration="8m24.654944361s" podCreationTimestamp="2026-03-19 09:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:40.620737927 +0000 UTC m=+118.006152743" watchObservedRunningTime="2026-03-19 09:26:40.654944361 +0000 UTC
m=+118.040359177" Mar 19 09:26:41.605677 master-0 kubenswrapper[15202]: I0319 09:26:41.605632 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-6mbkc" Mar 19 09:26:43.588786 master-0 kubenswrapper[15202]: I0319 09:26:43.588733 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" event={"ID":"dff4eb24-47ac-46be-bf3d-d939bd739b52","Type":"ContainerStarted","Data":"7f9d92c78a52c6aa66316437c8663c3c71d601d8cf00f52122830885a8099b99"} Mar 19 09:26:43.589993 master-0 kubenswrapper[15202]: I0319 09:26:43.589967 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279" event={"ID":"bb257b60-2a5e-4261-95da-2a641421ec4f","Type":"ContainerStarted","Data":"b35d467c2325dd67c4275b86f24975b3edb8c2b53c1e79b3fe1bc568884f8b18"} Mar 19 09:26:43.590216 master-0 kubenswrapper[15202]: I0319 09:26:43.590195 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279" Mar 19 09:26:43.595437 master-0 kubenswrapper[15202]: I0319 09:26:43.595366 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279" Mar 19 09:26:43.831661 master-0 kubenswrapper[15202]: I0319 09:26:43.831573 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" podStartSLOduration=338.265235987 podStartE2EDuration="5m41.831553225s" podCreationTimestamp="2026-03-19 09:21:02 +0000 UTC" firstStartedPulling="2026-03-19 09:26:39.458529554 +0000 UTC m=+116.843944370" lastFinishedPulling="2026-03-19 09:26:43.024846792 +0000 UTC m=+120.410261608" observedRunningTime="2026-03-19 09:26:43.828190518 +0000 UTC m=+121.213605334" watchObservedRunningTime="2026-03-19 
09:26:43.831553225 +0000 UTC m=+121.216968031" Mar 19 09:26:44.437378 master-0 kubenswrapper[15202]: I0319 09:26:44.437272 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" Mar 19 09:26:44.441168 master-0 kubenswrapper[15202]: I0319 09:26:44.441123 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" Mar 19 09:26:44.595592 master-0 kubenswrapper[15202]: I0319 09:26:44.595508 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" Mar 19 09:26:44.600307 master-0 kubenswrapper[15202]: I0319 09:26:44.600227 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dcf5569b5-4cst9" Mar 19 09:26:45.537704 master-0 kubenswrapper[15202]: I0319 09:26:45.537606 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-l2279" podStartSLOduration=174.467417182 podStartE2EDuration="2m57.537582835s" podCreationTimestamp="2026-03-19 09:23:48 +0000 UTC" firstStartedPulling="2026-03-19 09:26:39.955957632 +0000 UTC m=+117.341372448" lastFinishedPulling="2026-03-19 09:26:43.026123285 +0000 UTC m=+120.411538101" observedRunningTime="2026-03-19 09:26:44.00506113 +0000 UTC m=+121.390475956" watchObservedRunningTime="2026-03-19 09:26:45.537582835 +0000 UTC m=+122.922997671" Mar 19 09:26:48.247366 master-0 kubenswrapper[15202]: I0319 09:26:48.247285 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm"] Mar 19 09:26:48.248339 master-0 kubenswrapper[15202]: I0319 09:26:48.248312 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.250409 master-0 kubenswrapper[15202]: I0319 09:26:48.250344 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 09:26:48.250688 master-0 kubenswrapper[15202]: I0319 09:26:48.250664 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 09:26:48.251514 master-0 kubenswrapper[15202]: I0319 09:26:48.251461 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 19 09:26:48.252103 master-0 kubenswrapper[15202]: I0319 09:26:48.252075 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-lvzxr" Mar 19 09:26:48.288844 master-0 kubenswrapper[15202]: I0319 09:26:48.288743 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f648c39-05f1-4e84-bc49-235ce3be3286-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.288844 master-0 kubenswrapper[15202]: I0319 09:26:48.288825 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f648c39-05f1-4e84-bc49-235ce3be3286-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.289109 master-0 kubenswrapper[15202]: I0319 09:26:48.288943 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f648c39-05f1-4e84-bc49-235ce3be3286-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.289109 master-0 kubenswrapper[15202]: I0319 09:26:48.288976 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98g7t\" (UniqueName: \"kubernetes.io/projected/6f648c39-05f1-4e84-bc49-235ce3be3286-kube-api-access-98g7t\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.390895 master-0 kubenswrapper[15202]: I0319 09:26:48.390824 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f648c39-05f1-4e84-bc49-235ce3be3286-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.391097 master-0 kubenswrapper[15202]: I0319 09:26:48.390919 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98g7t\" (UniqueName: \"kubernetes.io/projected/6f648c39-05f1-4e84-bc49-235ce3be3286-kube-api-access-98g7t\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.391097 master-0 kubenswrapper[15202]: I0319 09:26:48.390997 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/6f648c39-05f1-4e84-bc49-235ce3be3286-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.391505 master-0 kubenswrapper[15202]: I0319 09:26:48.391379 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f648c39-05f1-4e84-bc49-235ce3be3286-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.391659 master-0 kubenswrapper[15202]: I0319 09:26:48.391598 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm"] Mar 19 09:26:48.392260 master-0 kubenswrapper[15202]: I0319 09:26:48.392217 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f648c39-05f1-4e84-bc49-235ce3be3286-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.396101 master-0 kubenswrapper[15202]: I0319 09:26:48.396067 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f648c39-05f1-4e84-bc49-235ce3be3286-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:48.396278 master-0 kubenswrapper[15202]: I0319 09:26:48.396227 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6f648c39-05f1-4e84-bc49-235ce3be3286-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:49.798947 master-0 kubenswrapper[15202]: I0319 09:26:49.798889 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98g7t\" (UniqueName: \"kubernetes.io/projected/6f648c39-05f1-4e84-bc49-235ce3be3286-kube-api-access-98g7t\") pod \"prometheus-operator-6c8df6d4b-6xvjm\" (UID: \"6f648c39-05f1-4e84-bc49-235ce3be3286\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:49.812549 master-0 kubenswrapper[15202]: I0319 09:26:49.812488 15202 scope.go:117] "RemoveContainer" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7" Mar 19 09:26:49.812786 master-0 kubenswrapper[15202]: E0319 09:26:49.812754 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" Mar 19 09:26:50.065117 master-0 kubenswrapper[15202]: I0319 09:26:50.064840 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" Mar 19 09:26:50.918539 master-0 kubenswrapper[15202]: I0319 09:26:50.918490 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm"] Mar 19 09:26:51.139184 master-0 kubenswrapper[15202]: I0319 09:26:51.139091 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-967b7967b-mb725"] Mar 19 09:26:51.140662 master-0 kubenswrapper[15202]: I0319 09:26:51.140621 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.144256 master-0 kubenswrapper[15202]: I0319 09:26:51.144195 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:26:51.144635 master-0 kubenswrapper[15202]: I0319 09:26:51.144591 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 09:26:51.144752 master-0 kubenswrapper[15202]: I0319 09:26:51.144719 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-2qc5w" Mar 19 09:26:51.145167 master-0 kubenswrapper[15202]: I0319 09:26:51.145142 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 09:26:51.145255 master-0 kubenswrapper[15202]: I0319 09:26:51.145225 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 09:26:51.145884 master-0 kubenswrapper[15202]: I0319 09:26:51.145853 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 09:26:51.146259 master-0 kubenswrapper[15202]: I0319 
09:26:51.146220 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 09:26:51.146452 master-0 kubenswrapper[15202]: I0319 09:26:51.146425 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 09:26:51.146998 master-0 kubenswrapper[15202]: I0319 09:26:51.146960 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 09:26:51.147414 master-0 kubenswrapper[15202]: I0319 09:26:51.147385 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 09:26:51.147974 master-0 kubenswrapper[15202]: I0319 09:26:51.147918 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 09:26:51.148265 master-0 kubenswrapper[15202]: I0319 09:26:51.148212 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 09:26:51.158418 master-0 kubenswrapper[15202]: I0319 09:26:51.158355 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 09:26:51.163942 master-0 kubenswrapper[15202]: I0319 09:26:51.163893 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 09:26:51.234845 master-0 kubenswrapper[15202]: I0319 09:26:51.234754 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " 
pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.234845 master-0 kubenswrapper[15202]: I0319 09:26:51.234826 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.234845 master-0 kubenswrapper[15202]: I0319 09:26:51.234863 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.235398 master-0 kubenswrapper[15202]: I0319 09:26:51.235106 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pphhm\" (UniqueName: \"kubernetes.io/projected/c656b2f4-785b-4403-b56a-637656900f07-kube-api-access-pphhm\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.235398 master-0 kubenswrapper[15202]: I0319 09:26:51.235275 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-serving-cert\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.235398 
master-0 kubenswrapper[15202]: I0319 09:26:51.235363 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-error\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.235853 master-0 kubenswrapper[15202]: I0319 09:26:51.235798 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-cliconfig\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.236019 master-0 kubenswrapper[15202]: I0319 09:26:51.235949 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-session\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.236249 master-0 kubenswrapper[15202]: I0319 09:26:51.236167 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-audit-policies\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.236318 master-0 kubenswrapper[15202]: I0319 09:26:51.236264 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-router-certs\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.236395 master-0 kubenswrapper[15202]: I0319 09:26:51.236363 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-login\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.236451 master-0 kubenswrapper[15202]: I0319 09:26:51.236411 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c656b2f4-785b-4403-b56a-637656900f07-audit-dir\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.236573 master-0 kubenswrapper[15202]: I0319 09:26:51.236538 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-service-ca\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.337564 master-0 kubenswrapper[15202]: I0319 09:26:51.337417 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pphhm\" (UniqueName: \"kubernetes.io/projected/c656b2f4-785b-4403-b56a-637656900f07-kube-api-access-pphhm\") pod 
\"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.337897 master-0 kubenswrapper[15202]: I0319 09:26:51.337656 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-serving-cert\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.337897 master-0 kubenswrapper[15202]: I0319 09:26:51.337707 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-error\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338039 master-0 kubenswrapper[15202]: I0319 09:26:51.337970 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-cliconfig\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338039 master-0 kubenswrapper[15202]: I0319 09:26:51.338007 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-session\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338039 master-0 
kubenswrapper[15202]: I0319 09:26:51.338039 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-audit-policies\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338179 master-0 kubenswrapper[15202]: I0319 09:26:51.338062 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-router-certs\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338179 master-0 kubenswrapper[15202]: I0319 09:26:51.338089 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-login\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338179 master-0 kubenswrapper[15202]: I0319 09:26:51.338107 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c656b2f4-785b-4403-b56a-637656900f07-audit-dir\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.338179 master-0 kubenswrapper[15202]: I0319 09:26:51.338128 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-service-ca\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339169 master-0 kubenswrapper[15202]: I0319 09:26:51.338572 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339169 master-0 kubenswrapper[15202]: I0319 09:26:51.338613 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339169 master-0 kubenswrapper[15202]: I0319 09:26:51.338643 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339169 master-0 kubenswrapper[15202]: I0319 09:26:51.338655 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339169 master-0 kubenswrapper[15202]: I0319 09:26:51.338737 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c656b2f4-785b-4403-b56a-637656900f07-audit-dir\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339580 master-0 kubenswrapper[15202]: I0319 09:26:51.339548 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-service-ca\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339659 master-0 kubenswrapper[15202]: I0319 09:26:51.339605 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-audit-policies\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.339803 master-0 kubenswrapper[15202]: I0319 09:26:51.339747 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.341559 master-0 kubenswrapper[15202]: I0319 09:26:51.341511 15202 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-serving-cert\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.342155 master-0 kubenswrapper[15202]: I0319 09:26:51.342095 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.343820 master-0 kubenswrapper[15202]: I0319 09:26:51.343710 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-login\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.343820 master-0 kubenswrapper[15202]: I0319 09:26:51.343727 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-router-certs\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.344380 master-0 kubenswrapper[15202]: I0319 09:26:51.344340 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-error\") 
pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.345445 master-0 kubenswrapper[15202]: I0319 09:26:51.345391 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-session\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.346713 master-0 kubenswrapper[15202]: I0319 09:26:51.346601 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:51.634095 master-0 kubenswrapper[15202]: I0319 09:26:51.633917 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" event={"ID":"6f648c39-05f1-4e84-bc49-235ce3be3286","Type":"ContainerStarted","Data":"3a905e1757b9ef112722a4561bad2c470319e729a6eb7242d858cbe47b04fa1f"} Mar 19 09:26:52.407356 master-0 kubenswrapper[15202]: I0319 09:26:52.407292 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-967b7967b-mb725"] Mar 19 09:26:52.735042 master-0 kubenswrapper[15202]: I0319 09:26:52.734969 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pphhm\" (UniqueName: \"kubernetes.io/projected/c656b2f4-785b-4403-b56a-637656900f07-kube-api-access-pphhm\") pod \"oauth-openshift-967b7967b-mb725\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " 
pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:52.969933 master-0 kubenswrapper[15202]: I0319 09:26:52.969885 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:53.352425 master-0 kubenswrapper[15202]: I0319 09:26:53.352287 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-967b7967b-mb725"] Mar 19 09:26:53.370008 master-0 kubenswrapper[15202]: W0319 09:26:53.369932 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc656b2f4_785b_4403_b56a_637656900f07.slice/crio-525b8c67ee6f67dd3a96af91aefd1b8871e8d6e92ee92d0c0fe811c2fe8afb8a WatchSource:0}: Error finding container 525b8c67ee6f67dd3a96af91aefd1b8871e8d6e92ee92d0c0fe811c2fe8afb8a: Status 404 returned error can't find the container with id 525b8c67ee6f67dd3a96af91aefd1b8871e8d6e92ee92d0c0fe811c2fe8afb8a Mar 19 09:26:53.660648 master-0 kubenswrapper[15202]: I0319 09:26:53.658799 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" event={"ID":"c656b2f4-785b-4403-b56a-637656900f07","Type":"ContainerStarted","Data":"525b8c67ee6f67dd3a96af91aefd1b8871e8d6e92ee92d0c0fe811c2fe8afb8a"} Mar 19 09:26:56.502200 master-0 kubenswrapper[15202]: I0319 09:26:56.498574 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-967b7967b-mb725"] Mar 19 09:26:56.699468 master-0 kubenswrapper[15202]: I0319 09:26:56.699374 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" event={"ID":"6f648c39-05f1-4e84-bc49-235ce3be3286","Type":"ContainerStarted","Data":"e503f697567e016dffa5a54ecb307eeb1410fdd148e784d21155784a8a3502b9"} Mar 19 09:26:56.699468 master-0 kubenswrapper[15202]: I0319 09:26:56.699461 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" event={"ID":"6f648c39-05f1-4e84-bc49-235ce3be3286","Type":"ContainerStarted","Data":"89c5ba02867429e28efc2ebd2c2c703a1e05f06769a9817e8e4f52afe04ce757"} Mar 19 09:26:56.743157 master-0 kubenswrapper[15202]: I0319 09:26:56.743014 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-6xvjm" podStartSLOduration=3.6371049170000003 podStartE2EDuration="8.742977464s" podCreationTimestamp="2026-03-19 09:26:48 +0000 UTC" firstStartedPulling="2026-03-19 09:26:50.938256455 +0000 UTC m=+128.323671301" lastFinishedPulling="2026-03-19 09:26:56.044129032 +0000 UTC m=+133.429543848" observedRunningTime="2026-03-19 09:26:56.741328482 +0000 UTC m=+134.126743328" watchObservedRunningTime="2026-03-19 09:26:56.742977464 +0000 UTC m=+134.128392280" Mar 19 09:26:56.942419 master-0 kubenswrapper[15202]: I0319 09:26:56.942352 15202 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:26:57.709665 master-0 kubenswrapper[15202]: I0319 09:26:57.709582 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" event={"ID":"c656b2f4-785b-4403-b56a-637656900f07","Type":"ContainerStarted","Data":"618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00"} Mar 19 09:26:57.711133 master-0 kubenswrapper[15202]: I0319 09:26:57.710737 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:57.715389 master-0 kubenswrapper[15202]: I0319 09:26:57.715319 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:26:58.035945 master-0 kubenswrapper[15202]: I0319 09:26:58.035745 15202 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" podStartSLOduration=5.05905639 podStartE2EDuration="8.035717385s" podCreationTimestamp="2026-03-19 09:26:50 +0000 UTC" firstStartedPulling="2026-03-19 09:26:53.380953757 +0000 UTC m=+130.766368563" lastFinishedPulling="2026-03-19 09:26:56.357614742 +0000 UTC m=+133.743029558" observedRunningTime="2026-03-19 09:26:58.03163375 +0000 UTC m=+135.417048626" watchObservedRunningTime="2026-03-19 09:26:58.035717385 +0000 UTC m=+135.421132201" Mar 19 09:26:58.485263 master-0 kubenswrapper[15202]: I0319 09:26:58.485178 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fxzb9"] Mar 19 09:26:58.487056 master-0 kubenswrapper[15202]: I0319 09:26:58.487017 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.489630 master-0 kubenswrapper[15202]: I0319 09:26:58.489570 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 09:26:58.489988 master-0 kubenswrapper[15202]: I0319 09:26:58.489962 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 09:26:58.507292 master-0 kubenswrapper[15202]: I0319 09:26:58.507235 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk"] Mar 19 09:26:58.513926 master-0 kubenswrapper[15202]: I0319 09:26:58.513857 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.520840 master-0 kubenswrapper[15202]: I0319 09:26:58.520771 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px"] Mar 19 09:26:58.522557 master-0 kubenswrapper[15202]: I0319 09:26:58.522504 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 19 09:26:58.524030 master-0 kubenswrapper[15202]: I0319 09:26:58.523644 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 09:26:58.525043 master-0 kubenswrapper[15202]: I0319 09:26:58.524193 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 09:26:58.528547 master-0 kubenswrapper[15202]: I0319 09:26:58.528508 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.545527 master-0 kubenswrapper[15202]: I0319 09:26:58.545432 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 09:26:58.560836 master-0 kubenswrapper[15202]: I0319 09:26:58.560441 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 09:26:58.580694 master-0 kubenswrapper[15202]: I0319 09:26:58.579085 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px"] Mar 19 09:26:58.580694 master-0 kubenswrapper[15202]: I0319 09:26:58.579164 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk"] Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581332 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581403 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581429 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhjr\" (UniqueName: \"kubernetes.io/projected/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-kube-api-access-zvhjr\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581447 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90ebca14-2ef4-4875-a682-48d7cc6fdd63-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581535 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-metrics-client-ca\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581564 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9s6l\" (UniqueName: \"kubernetes.io/projected/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-kube-api-access-k9s6l\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581584 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-textfile\") pod 
\"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581601 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90ebca14-2ef4-4875-a682-48d7cc6fdd63-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581617 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-wtmp\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581642 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgj2\" (UniqueName: \"kubernetes.io/projected/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-api-access-mqgj2\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581663 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581685 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581712 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581747 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581767 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-tls\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581792 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-sys\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581810 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-root\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.582019 master-0 kubenswrapper[15202]: I0319 09:26:58.581829 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690489 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690585 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhjr\" (UniqueName: \"kubernetes.io/projected/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-kube-api-access-zvhjr\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" 
Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690611 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90ebca14-2ef4-4875-a682-48d7cc6fdd63-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690632 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-metrics-client-ca\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690661 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9s6l\" (UniqueName: \"kubernetes.io/projected/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-kube-api-access-k9s6l\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690681 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-textfile\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690699 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90ebca14-2ef4-4875-a682-48d7cc6fdd63-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: 
\"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690719 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-wtmp\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690750 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgj2\" (UniqueName: \"kubernetes.io/projected/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-api-access-mqgj2\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.690719 master-0 kubenswrapper[15202]: I0319 09:26:58.690773 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690796 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690826 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690886 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690910 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-tls\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690935 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-sys\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690960 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-root\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " 
pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.690990 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.691493 master-0 kubenswrapper[15202]: I0319 09:26:58.691018 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.692770 master-0 kubenswrapper[15202]: E0319 09:26:58.692281 15202 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 19 09:26:58.692770 master-0 kubenswrapper[15202]: E0319 09:26:58.692353 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-tls podName:feb06c2f-79d5-4c1d-a8da-8db82de9b2f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:59.192330784 +0000 UTC m=+136.577745600 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-tls") pod "openshift-state-metrics-5dc6c74576-gh4px" (UID: "feb06c2f-79d5-4c1d-a8da-8db82de9b2f9") : secret "openshift-state-metrics-tls" not found Mar 19 09:26:58.693158 master-0 kubenswrapper[15202]: I0319 09:26:58.693073 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90ebca14-2ef4-4875-a682-48d7cc6fdd63-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.694632 master-0 kubenswrapper[15202]: I0319 09:26:58.693746 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-metrics-client-ca\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.694632 master-0 kubenswrapper[15202]: I0319 09:26:58.694217 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-textfile\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.694879 master-0 kubenswrapper[15202]: I0319 09:26:58.694852 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90ebca14-2ef4-4875-a682-48d7cc6fdd63-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 
09:26:58.695089 master-0 kubenswrapper[15202]: I0319 09:26:58.695062 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-wtmp\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.704537 master-0 kubenswrapper[15202]: I0319 09:26:58.702869 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-sys\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.704537 master-0 kubenswrapper[15202]: E0319 09:26:58.703257 15202 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 19 09:26:58.704537 master-0 kubenswrapper[15202]: E0319 09:26:58.703375 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-tls podName:90ebca14-2ef4-4875-a682-48d7cc6fdd63 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:59.203340896 +0000 UTC m=+136.588755772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-tls") pod "kube-state-metrics-7bbc969446-vjbnk" (UID: "90ebca14-2ef4-4875-a682-48d7cc6fdd63") : secret "kube-state-metrics-tls" not found Mar 19 09:26:58.704537 master-0 kubenswrapper[15202]: I0319 09:26:58.703727 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-root\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.704537 master-0 kubenswrapper[15202]: I0319 09:26:58.704169 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.704537 master-0 kubenswrapper[15202]: I0319 09:26:58.704426 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.709559 master-0 kubenswrapper[15202]: I0319 09:26:58.709503 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " 
pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.713500 master-0 kubenswrapper[15202]: I0319 09:26:58.710152 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.718493 master-0 kubenswrapper[15202]: I0319 09:26:58.715795 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-node-exporter-tls\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.718493 master-0 kubenswrapper[15202]: I0319 09:26:58.717059 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9s6l\" (UniqueName: \"kubernetes.io/projected/9a8f8ced-6f9c-44ec-885d-da84f0ae27ae-kube-api-access-k9s6l\") pod \"node-exporter-fxzb9\" (UID: \"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae\") " pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:58.730154 master-0 kubenswrapper[15202]: I0319 09:26:58.724727 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhjr\" (UniqueName: \"kubernetes.io/projected/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-kube-api-access-zvhjr\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.730154 master-0 kubenswrapper[15202]: I0319 09:26:58.724993 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:58.738415 master-0 kubenswrapper[15202]: I0319 09:26:58.734445 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgj2\" (UniqueName: \"kubernetes.io/projected/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-api-access-mqgj2\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:58.814393 master-0 kubenswrapper[15202]: I0319 09:26:58.814315 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fxzb9" Mar 19 09:26:59.202171 master-0 kubenswrapper[15202]: I0319 09:26:59.202065 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:59.206569 master-0 kubenswrapper[15202]: I0319 09:26:59.206512 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb06c2f-79d5-4c1d-a8da-8db82de9b2f9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-gh4px\" (UID: \"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:59.247233 master-0 kubenswrapper[15202]: I0319 09:26:59.247142 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" Mar 19 09:26:59.309509 master-0 kubenswrapper[15202]: I0319 09:26:59.308152 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:59.314493 master-0 kubenswrapper[15202]: I0319 09:26:59.312216 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90ebca14-2ef4-4875-a682-48d7cc6fdd63-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-vjbnk\" (UID: \"90ebca14-2ef4-4875-a682-48d7cc6fdd63\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:59.416529 master-0 kubenswrapper[15202]: I0319 09:26:59.416445 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xhcnv"] Mar 19 09:26:59.417671 master-0 kubenswrapper[15202]: I0319 09:26:59.417377 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.420311 master-0 kubenswrapper[15202]: I0319 09:26:59.420261 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 09:26:59.421575 master-0 kubenswrapper[15202]: I0319 09:26:59.421538 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 09:26:59.469667 master-0 kubenswrapper[15202]: I0319 09:26:59.469491 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" Mar 19 09:26:59.512037 master-0 kubenswrapper[15202]: I0319 09:26:59.511953 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6605e538-7b11-4244-b239-22650d1f5bcb-certs\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.512509 master-0 kubenswrapper[15202]: I0319 09:26:59.512382 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dqd\" (UniqueName: \"kubernetes.io/projected/6605e538-7b11-4244-b239-22650d1f5bcb-kube-api-access-r2dqd\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.512509 master-0 kubenswrapper[15202]: I0319 09:26:59.512445 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6605e538-7b11-4244-b239-22650d1f5bcb-node-bootstrap-token\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.619586 master-0 kubenswrapper[15202]: I0319 09:26:59.619298 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6605e538-7b11-4244-b239-22650d1f5bcb-certs\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.619586 master-0 kubenswrapper[15202]: I0319 09:26:59.619439 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r2dqd\" (UniqueName: \"kubernetes.io/projected/6605e538-7b11-4244-b239-22650d1f5bcb-kube-api-access-r2dqd\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.619586 master-0 kubenswrapper[15202]: I0319 09:26:59.619510 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6605e538-7b11-4244-b239-22650d1f5bcb-node-bootstrap-token\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.627559 master-0 kubenswrapper[15202]: I0319 09:26:59.627510 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6605e538-7b11-4244-b239-22650d1f5bcb-certs\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.638964 master-0 kubenswrapper[15202]: I0319 09:26:59.631065 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6605e538-7b11-4244-b239-22650d1f5bcb-node-bootstrap-token\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.657768 master-0 kubenswrapper[15202]: I0319 09:26:59.657682 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2dqd\" (UniqueName: \"kubernetes.io/projected/6605e538-7b11-4244-b239-22650d1f5bcb-kube-api-access-r2dqd\") pod \"machine-config-server-xhcnv\" (UID: \"6605e538-7b11-4244-b239-22650d1f5bcb\") " pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.711551 master-0 
kubenswrapper[15202]: I0319 09:26:59.703095 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:26:59.711551 master-0 kubenswrapper[15202]: I0319 09:26:59.706830 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.714590 master-0 kubenswrapper[15202]: I0319 09:26:59.712215 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:26:59.714590 master-0 kubenswrapper[15202]: I0319 09:26:59.712588 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 09:26:59.714590 master-0 kubenswrapper[15202]: I0319 09:26:59.712738 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:26:59.714590 master-0 kubenswrapper[15202]: I0319 09:26:59.712864 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 09:26:59.714590 master-0 kubenswrapper[15202]: I0319 09:26:59.713055 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:26:59.714590 master-0 kubenswrapper[15202]: I0319 09:26:59.713206 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 09:26:59.715432 master-0 kubenswrapper[15202]: I0319 09:26:59.714882 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 09:26:59.717653 master-0 kubenswrapper[15202]: I0319 09:26:59.716186 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 09:26:59.745821 master-0 kubenswrapper[15202]: I0319 
09:26:59.742034 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:26:59.752124 master-0 kubenswrapper[15202]: I0319 09:26:59.752054 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxzb9" event={"ID":"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae","Type":"ContainerStarted","Data":"5df48dc0839d9bcd31212229cf2d23b8779fb7578845283520790c9008e1677b"} Mar 19 09:26:59.770560 master-0 kubenswrapper[15202]: I0319 09:26:59.755488 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xhcnv" Mar 19 09:26:59.795528 master-0 kubenswrapper[15202]: I0319 09:26:59.792668 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px"] Mar 19 09:26:59.801737 master-0 kubenswrapper[15202]: W0319 09:26:59.797487 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6605e538_7b11_4244_b239_22650d1f5bcb.slice/crio-450404db22697d3448a899b0b9f0607e54d9fc8e2ab0d01541b4e710743b416f WatchSource:0}: Error finding container 450404db22697d3448a899b0b9f0607e54d9fc8e2ab0d01541b4e710743b416f: Status 404 returned error can't find the container with id 450404db22697d3448a899b0b9f0607e54d9fc8e2ab0d01541b4e710743b416f Mar 19 09:26:59.824833 master-0 kubenswrapper[15202]: W0319 09:26:59.818791 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeb06c2f_79d5_4c1d_a8da_8db82de9b2f9.slice/crio-8c55447999b4467b2bf1ce2c05df84246ed63e6b0fcbc33d3ece37fc12b51dff WatchSource:0}: Error finding container 8c55447999b4467b2bf1ce2c05df84246ed63e6b0fcbc33d3ece37fc12b51dff: Status 404 returned error can't find the container with id 8c55447999b4467b2bf1ce2c05df84246ed63e6b0fcbc33d3ece37fc12b51dff Mar 19 
09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.826730 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.826829 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.826880 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.826916 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.826952 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827013 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-kube-api-access-s6wk6\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827054 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827083 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827134 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-web-config\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827163 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-config-out\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827248 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.828790 master-0 kubenswrapper[15202]: I0319 09:26:59.827283 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930259 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930322 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-kube-api-access-s6wk6\") pod \"alertmanager-main-0\" 
(UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930349 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930373 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930408 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-web-config\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930423 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-config-out\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930453 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930490 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930515 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930553 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930577 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.931138 master-0 kubenswrapper[15202]: I0319 09:26:59.930595 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.933489 master-0 kubenswrapper[15202]: I0319 09:26:59.932622 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.936754 master-0 kubenswrapper[15202]: I0319 09:26:59.936367 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.938186 master-0 kubenswrapper[15202]: I0319 09:26:59.937956 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.938186 master-0 kubenswrapper[15202]: I0319 09:26:59.938140 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.942519 master-0 kubenswrapper[15202]: I0319 09:26:59.942347 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.944064 master-0 kubenswrapper[15202]: I0319 09:26:59.944008 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.948052 master-0 kubenswrapper[15202]: I0319 09:26:59.945686 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.948052 master-0 kubenswrapper[15202]: I0319 09:26:59.947762 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-config-out\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.948052 master-0 kubenswrapper[15202]: I0319 09:26:59.947788 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.948052 master-0 kubenswrapper[15202]: I0319 09:26:59.947912 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.950665 master-0 kubenswrapper[15202]: I0319 09:26:59.950635 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-web-config\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.970182 master-0 kubenswrapper[15202]: I0319 09:26:59.970123 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-kube-api-access-s6wk6\") pod \"alertmanager-main-0\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:26:59.983016 master-0 kubenswrapper[15202]: I0319 09:26:59.982929 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk"] Mar 19 09:27:00.090447 master-0 kubenswrapper[15202]: I0319 09:27:00.089315 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:27:00.761306 master-0 kubenswrapper[15202]: I0319 09:27:00.761207 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" event={"ID":"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9","Type":"ContainerStarted","Data":"60557674730aae204d5041307347600fb939ced388f0dd77c61760175a574070"} Mar 19 09:27:00.761306 master-0 kubenswrapper[15202]: I0319 09:27:00.761282 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" event={"ID":"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9","Type":"ContainerStarted","Data":"d92fd481133e78f6d984bafcdb8ba8315d619e59bd8685bf79a6f779b8124978"} Mar 19 09:27:00.761306 master-0 kubenswrapper[15202]: I0319 09:27:00.761296 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" event={"ID":"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9","Type":"ContainerStarted","Data":"8c55447999b4467b2bf1ce2c05df84246ed63e6b0fcbc33d3ece37fc12b51dff"} Mar 19 09:27:00.763361 master-0 kubenswrapper[15202]: I0319 09:27:00.763311 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xhcnv" event={"ID":"6605e538-7b11-4244-b239-22650d1f5bcb","Type":"ContainerStarted","Data":"c70365f1a5da58600fa16558c40c3b624c862bb8a97a4e78a7f73992c7b0406c"} Mar 19 09:27:00.763361 master-0 kubenswrapper[15202]: I0319 09:27:00.763349 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xhcnv" event={"ID":"6605e538-7b11-4244-b239-22650d1f5bcb","Type":"ContainerStarted","Data":"450404db22697d3448a899b0b9f0607e54d9fc8e2ab0d01541b4e710743b416f"} Mar 19 09:27:00.766180 master-0 kubenswrapper[15202]: I0319 09:27:00.766140 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-fxzb9" event={"ID":"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae","Type":"ContainerStarted","Data":"8e35994ac4e66a273d256ca917f2061f2d406a0e2abfa9ffab775cd8f53b1042"} Mar 19 09:27:00.768357 master-0 kubenswrapper[15202]: I0319 09:27:00.768313 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" event={"ID":"90ebca14-2ef4-4875-a682-48d7cc6fdd63","Type":"ContainerStarted","Data":"301e4766733133edd5e89730132dd643ffd1da84db45d8fa65e9902fda6301cb"} Mar 19 09:27:00.935506 master-0 kubenswrapper[15202]: I0319 09:27:00.935375 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:27:01.779046 master-0 kubenswrapper[15202]: I0319 09:27:01.778792 15202 generic.go:334] "Generic (PLEG): container finished" podID="9a8f8ced-6f9c-44ec-885d-da84f0ae27ae" containerID="8e35994ac4e66a273d256ca917f2061f2d406a0e2abfa9ffab775cd8f53b1042" exitCode=0 Mar 19 09:27:01.780409 master-0 kubenswrapper[15202]: I0319 09:27:01.779000 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxzb9" event={"ID":"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae","Type":"ContainerDied","Data":"8e35994ac4e66a273d256ca917f2061f2d406a0e2abfa9ffab775cd8f53b1042"} Mar 19 09:27:01.781157 master-0 kubenswrapper[15202]: I0319 09:27:01.781096 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"161b2a221d5f01568762c88faef403a00e73faedbb819207dd179cf3b76fe882"} Mar 19 09:27:02.836568 master-0 kubenswrapper[15202]: I0319 09:27:02.508922 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xhcnv" podStartSLOduration=3.508693342 podStartE2EDuration="3.508693342s" podCreationTimestamp="2026-03-19 09:26:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:01.004232521 +0000 UTC m=+138.389647337" watchObservedRunningTime="2026-03-19 09:27:02.508693342 +0000 UTC m=+139.894108178" Mar 19 09:27:02.836568 master-0 kubenswrapper[15202]: I0319 09:27:02.792555 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxzb9" event={"ID":"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae","Type":"ContainerStarted","Data":"b81d743365436c027d7f9c8d9e042f84a3d353f013396a3c42c86f1bb6200566"} Mar 19 09:27:04.812664 master-0 kubenswrapper[15202]: I0319 09:27:04.812555 15202 scope.go:117] "RemoveContainer" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7" Mar 19 09:27:04.813230 master-0 kubenswrapper[15202]: E0319 09:27:04.812981 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=console-operator pod=console-operator-76b6568d85-grltt_openshift-console-operator(269465d8-91d6-40d7-bfde-3eff9b93c1cf)\"" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podUID="269465d8-91d6-40d7-bfde-3eff9b93c1cf" Mar 19 09:27:05.823128 master-0 kubenswrapper[15202]: I0319 09:27:05.823001 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fxzb9" event={"ID":"9a8f8ced-6f9c-44ec-885d-da84f0ae27ae","Type":"ContainerStarted","Data":"b114982fbc8d1eb42b9dfb0c5ec90dcfa28c9ab9870c4e900b79fba8fb12a078"} Mar 19 09:27:12.261178 master-0 kubenswrapper[15202]: I0319 09:27:12.261099 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7d7bcd498-w2pfb"] Mar 19 09:27:12.265439 master-0 kubenswrapper[15202]: I0319 09:27:12.265365 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.281995 master-0 kubenswrapper[15202]: I0319 09:27:12.281929 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 09:27:12.282290 master-0 kubenswrapper[15202]: I0319 09:27:12.282258 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 19 09:27:12.282788 master-0 kubenswrapper[15202]: I0319 09:27:12.282754 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 09:27:12.282949 master-0 kubenswrapper[15202]: I0319 09:27:12.282907 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 09:27:12.285363 master-0 kubenswrapper[15202]: I0319 09:27:12.285313 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 09:27:12.285712 master-0 kubenswrapper[15202]: I0319 09:27:12.285669 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-17lh7pj6890g7" Mar 19 09:27:12.360613 master-0 kubenswrapper[15202]: I0319 09:27:12.360536 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8163b1-50e4-4fa7-9324-fe74c24549de-metrics-client-ca\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.360613 master-0 kubenswrapper[15202]: I0319 09:27:12.360618 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.360907 master-0 kubenswrapper[15202]: I0319 09:27:12.360668 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.360907 master-0 kubenswrapper[15202]: I0319 09:27:12.360707 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.360907 master-0 kubenswrapper[15202]: I0319 09:27:12.360726 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-grpc-tls\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.361117 master-0 kubenswrapper[15202]: I0319 09:27:12.361040 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/8e8163b1-50e4-4fa7-9324-fe74c24549de-kube-api-access-mg9pn\") pod 
\"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.361366 master-0 kubenswrapper[15202]: I0319 09:27:12.361298 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-tls\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.361495 master-0 kubenswrapper[15202]: I0319 09:27:12.361446 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.463089 master-0 kubenswrapper[15202]: I0319 09:27:12.463018 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.463343 master-0 kubenswrapper[15202]: I0319 09:27:12.463222 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8163b1-50e4-4fa7-9324-fe74c24549de-metrics-client-ca\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 
09:27:12.463343 master-0 kubenswrapper[15202]: I0319 09:27:12.463279 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.463343 master-0 kubenswrapper[15202]: I0319 09:27:12.463311 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.463485 master-0 kubenswrapper[15202]: I0319 09:27:12.463429 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.464074 master-0 kubenswrapper[15202]: I0319 09:27:12.464045 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-grpc-tls\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.464142 master-0 kubenswrapper[15202]: I0319 09:27:12.464100 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/8e8163b1-50e4-4fa7-9324-fe74c24549de-kube-api-access-mg9pn\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.464551 master-0 kubenswrapper[15202]: I0319 09:27:12.464143 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-tls\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.464785 master-0 kubenswrapper[15202]: I0319 09:27:12.464726 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e8163b1-50e4-4fa7-9324-fe74c24549de-metrics-client-ca\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.467242 master-0 kubenswrapper[15202]: I0319 09:27:12.466692 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.467242 master-0 kubenswrapper[15202]: I0319 09:27:12.466826 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " 
pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.467242 master-0 kubenswrapper[15202]: I0319 09:27:12.467174 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.469420 master-0 kubenswrapper[15202]: I0319 09:27:12.469375 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-tls\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.469767 master-0 kubenswrapper[15202]: I0319 09:27:12.469713 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.471447 master-0 kubenswrapper[15202]: I0319 09:27:12.471352 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e8163b1-50e4-4fa7-9324-fe74c24549de-secret-grpc-tls\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:12.961701 master-0 kubenswrapper[15202]: I0319 09:27:12.961591 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-7d7bcd498-w2pfb"] Mar 19 09:27:13.485654 master-0 kubenswrapper[15202]: I0319 09:27:13.484043 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fxzb9" podStartSLOduration=13.997768962 podStartE2EDuration="15.484025247s" podCreationTimestamp="2026-03-19 09:26:58 +0000 UTC" firstStartedPulling="2026-03-19 09:26:58.846100977 +0000 UTC m=+136.231515793" lastFinishedPulling="2026-03-19 09:27:00.332357262 +0000 UTC m=+137.717772078" observedRunningTime="2026-03-19 09:27:13.482089897 +0000 UTC m=+150.867504713" watchObservedRunningTime="2026-03-19 09:27:13.484025247 +0000 UTC m=+150.869440063" Mar 19 09:27:14.323951 master-0 kubenswrapper[15202]: I0319 09:27:14.323904 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/8e8163b1-50e4-4fa7-9324-fe74c24549de-kube-api-access-mg9pn\") pod \"thanos-querier-7d7bcd498-w2pfb\" (UID: \"8e8163b1-50e4-4fa7-9324-fe74c24549de\") " pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:14.390679 master-0 kubenswrapper[15202]: I0319 09:27:14.390618 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:15.131599 master-0 kubenswrapper[15202]: I0319 09:27:15.131374 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8c858dd9d-j8mx9"] Mar 19 09:27:15.132352 master-0 kubenswrapper[15202]: I0319 09:27:15.132319 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.139979 master-0 kubenswrapper[15202]: I0319 09:27:15.137008 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 09:27:15.139979 master-0 kubenswrapper[15202]: I0319 09:27:15.137220 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 09:27:15.139979 master-0 kubenswrapper[15202]: I0319 09:27:15.137379 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 09:27:15.139979 master-0 kubenswrapper[15202]: I0319 09:27:15.137539 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5339u6k6jn3h3" Mar 19 09:27:15.139979 master-0 kubenswrapper[15202]: I0319 09:27:15.137688 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 09:27:15.182923 master-0 kubenswrapper[15202]: I0319 09:27:15.182864 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8c858dd9d-j8mx9"] Mar 19 09:27:15.228443 master-0 kubenswrapper[15202]: I0319 09:27:15.228354 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2bw\" (UniqueName: \"kubernetes.io/projected/752fcbfa-1386-4b68-ac42-5ace89d63908-kube-api-access-ss2bw\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.228443 master-0 kubenswrapper[15202]: I0319 09:27:15.228436 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/752fcbfa-1386-4b68-ac42-5ace89d63908-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.228820 master-0 kubenswrapper[15202]: I0319 09:27:15.228505 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-client-ca-bundle\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.228820 master-0 kubenswrapper[15202]: I0319 09:27:15.228534 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-secret-metrics-client-certs\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.228820 master-0 kubenswrapper[15202]: I0319 09:27:15.228568 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/752fcbfa-1386-4b68-ac42-5ace89d63908-audit-log\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.228820 master-0 kubenswrapper[15202]: I0319 09:27:15.228599 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-secret-metrics-server-tls\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " 
pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.228820 master-0 kubenswrapper[15202]: I0319 09:27:15.228665 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/752fcbfa-1386-4b68-ac42-5ace89d63908-metrics-server-audit-profiles\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329395 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/752fcbfa-1386-4b68-ac42-5ace89d63908-metrics-server-audit-profiles\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329536 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2bw\" (UniqueName: \"kubernetes.io/projected/752fcbfa-1386-4b68-ac42-5ace89d63908-kube-api-access-ss2bw\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329579 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/752fcbfa-1386-4b68-ac42-5ace89d63908-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329624 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-client-ca-bundle\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329656 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-secret-metrics-client-certs\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329692 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/752fcbfa-1386-4b68-ac42-5ace89d63908-audit-log\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.331503 master-0 kubenswrapper[15202]: I0319 09:27:15.329733 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-secret-metrics-server-tls\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.336560 master-0 kubenswrapper[15202]: I0319 09:27:15.333993 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-secret-metrics-server-tls\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " 
pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.357275 master-0 kubenswrapper[15202]: I0319 09:27:15.355173 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/752fcbfa-1386-4b68-ac42-5ace89d63908-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.357275 master-0 kubenswrapper[15202]: I0319 09:27:15.356106 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/752fcbfa-1386-4b68-ac42-5ace89d63908-metrics-server-audit-profiles\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.357275 master-0 kubenswrapper[15202]: I0319 09:27:15.356594 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/752fcbfa-1386-4b68-ac42-5ace89d63908-audit-log\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.392226 master-0 kubenswrapper[15202]: I0319 09:27:15.390696 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-client-ca-bundle\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.401588 master-0 kubenswrapper[15202]: I0319 09:27:15.395584 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/752fcbfa-1386-4b68-ac42-5ace89d63908-secret-metrics-client-certs\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.428390 master-0 kubenswrapper[15202]: I0319 09:27:15.428326 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2bw\" (UniqueName: \"kubernetes.io/projected/752fcbfa-1386-4b68-ac42-5ace89d63908-kube-api-access-ss2bw\") pod \"metrics-server-8c858dd9d-j8mx9\" (UID: \"752fcbfa-1386-4b68-ac42-5ace89d63908\") " pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.452769 master-0 kubenswrapper[15202]: I0319 09:27:15.452367 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58fff6b545-fvbrw"] Mar 19 09:27:15.452769 master-0 kubenswrapper[15202]: I0319 09:27:15.452744 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" podUID="e58349d4-1322-4ebe-a513-146773f77a4b" containerName="controller-manager" containerID="cri-o://ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730" gracePeriod=30 Mar 19 09:27:15.456877 master-0 kubenswrapper[15202]: I0319 09:27:15.455976 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:15.499699 master-0 kubenswrapper[15202]: I0319 09:27:15.499588 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"] Mar 19 09:27:15.503551 master-0 kubenswrapper[15202]: I0319 09:27:15.500502 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" podUID="80dad6a1-700f-4953-88e2-edc17468af14" containerName="route-controller-manager" containerID="cri-o://8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407" gracePeriod=30 Mar 19 09:27:15.503551 master-0 kubenswrapper[15202]: E0319 09:27:15.503122 15202 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58349d4_1322_4ebe_a513_146773f77a4b.slice/crio-conmon-ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:27:15.722151 master-0 kubenswrapper[15202]: I0319 09:27:15.722095 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"] Mar 19 09:27:15.723153 master-0 kubenswrapper[15202]: I0319 09:27:15.723119 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.727153 master-0 kubenswrapper[15202]: I0319 09:27:15.727105 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 09:27:15.727488 master-0 kubenswrapper[15202]: I0319 09:27:15.727460 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 09:27:15.736890 master-0 kubenswrapper[15202]: I0319 09:27:15.736840 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff4eea3a-e218-4c34-adcf-84c4d7dea325-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-dwqmc\" (UID: \"ff4eea3a-e218-4c34-adcf-84c4d7dea325\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.737047 master-0 kubenswrapper[15202]: I0319 09:27:15.737020 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eea3a-e218-4c34-adcf-84c4d7dea325-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-dwqmc\" (UID: \"ff4eea3a-e218-4c34-adcf-84c4d7dea325\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.738396 master-0 kubenswrapper[15202]: I0319 09:27:15.738374 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"]
Mar 19 09:27:15.839413 master-0 kubenswrapper[15202]: I0319 09:27:15.837998 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eea3a-e218-4c34-adcf-84c4d7dea325-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-dwqmc\" (UID: \"ff4eea3a-e218-4c34-adcf-84c4d7dea325\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.839413 master-0 kubenswrapper[15202]: I0319 09:27:15.838173 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff4eea3a-e218-4c34-adcf-84c4d7dea325-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-dwqmc\" (UID: \"ff4eea3a-e218-4c34-adcf-84c4d7dea325\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.839413 master-0 kubenswrapper[15202]: I0319 09:27:15.838826 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ff4eea3a-e218-4c34-adcf-84c4d7dea325-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-dwqmc\" (UID: \"ff4eea3a-e218-4c34-adcf-84c4d7dea325\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.841166 master-0 kubenswrapper[15202]: I0319 09:27:15.841137 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff4eea3a-e218-4c34-adcf-84c4d7dea325-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-dwqmc\" (UID: \"ff4eea3a-e218-4c34-adcf-84c4d7dea325\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:15.910652 master-0 kubenswrapper[15202]: I0319 09:27:15.910605 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" event={"ID":"90ebca14-2ef4-4875-a682-48d7cc6fdd63","Type":"ContainerStarted","Data":"37ecba0a3a5159c804500afbeeb105247dd20ebff622e918541cb552f906a1dc"}
Mar 19 09:27:16.042810 master-0 kubenswrapper[15202]: I0319 09:27:16.042687 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"
Mar 19 09:27:16.432458 master-0 kubenswrapper[15202]: I0319 09:27:16.431108 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"]
Mar 19 09:27:16.432458 master-0 kubenswrapper[15202]: I0319 09:27:16.432205 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"
Mar 19 09:27:16.435137 master-0 kubenswrapper[15202]: I0319 09:27:16.435027 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-j956x"
Mar 19 09:27:16.435137 master-0 kubenswrapper[15202]: I0319 09:27:16.435124 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 09:27:16.453558 master-0 kubenswrapper[15202]: I0319 09:27:16.453504 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15566e56-f6ea-4628-87cd-c6151735cea3-monitoring-plugin-cert\") pod \"monitoring-plugin-5d7d9df6f8-qwngc\" (UID: \"15566e56-f6ea-4628-87cd-c6151735cea3\") " pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"
Mar 19 09:27:16.486402 master-0 kubenswrapper[15202]: I0319 09:27:16.485599 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"]
Mar 19 09:27:16.554739 master-0 kubenswrapper[15202]: I0319 09:27:16.554690 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15566e56-f6ea-4628-87cd-c6151735cea3-monitoring-plugin-cert\") pod \"monitoring-plugin-5d7d9df6f8-qwngc\" (UID: \"15566e56-f6ea-4628-87cd-c6151735cea3\") " pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"
Mar 19 09:27:16.558586 master-0 kubenswrapper[15202]: I0319 09:27:16.558548 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/15566e56-f6ea-4628-87cd-c6151735cea3-monitoring-plugin-cert\") pod \"monitoring-plugin-5d7d9df6f8-qwngc\" (UID: \"15566e56-f6ea-4628-87cd-c6151735cea3\") " pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"
Mar 19 09:27:16.613873 master-0 kubenswrapper[15202]: I0319 09:27:16.613811 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7d7bcd498-w2pfb"]
Mar 19 09:27:16.665842 master-0 kubenswrapper[15202]: I0319 09:27:16.665812 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw"
Mar 19 09:27:16.671721 master-0 kubenswrapper[15202]: I0319 09:27:16.671678 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8c858dd9d-j8mx9"]
Mar 19 09:27:16.676623 master-0 kubenswrapper[15202]: W0319 09:27:16.676588 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod752fcbfa_1386_4b68_ac42_5ace89d63908.slice/crio-e8ecb5005bbf3e8676e21e25d8a2a73b25f1913607c6949dabf8e0d5dacef6bd WatchSource:0}: Error finding container e8ecb5005bbf3e8676e21e25d8a2a73b25f1913607c6949dabf8e0d5dacef6bd: Status 404 returned error can't find the container with id e8ecb5005bbf3e8676e21e25d8a2a73b25f1913607c6949dabf8e0d5dacef6bd
Mar 19 09:27:16.682119 master-0 kubenswrapper[15202]: I0319 09:27:16.681855 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"
Mar 19 09:27:16.777236 master-0 kubenswrapper[15202]: I0319 09:27:16.771949 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"
Mar 19 09:27:16.779012 master-0 kubenswrapper[15202]: I0319 09:27:16.778660 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc"]
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: I0319 09:27:16.796873 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"]
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: E0319 09:27:16.797138 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58349d4-1322-4ebe-a513-146773f77a4b" containerName="controller-manager"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: I0319 09:27:16.797149 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58349d4-1322-4ebe-a513-146773f77a4b" containerName="controller-manager"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: E0319 09:27:16.797180 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80dad6a1-700f-4953-88e2-edc17468af14" containerName="route-controller-manager"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: I0319 09:27:16.797187 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="80dad6a1-700f-4953-88e2-edc17468af14" containerName="route-controller-manager"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: I0319 09:27:16.797313 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="80dad6a1-700f-4953-88e2-edc17468af14" containerName="route-controller-manager"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: I0319 09:27:16.797329 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="e58349d4-1322-4ebe-a513-146773f77a4b" containerName="controller-manager"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: I0319 09:27:16.797823 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:16.812762 master-0 kubenswrapper[15202]: W0319 09:27:16.804387 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4eea3a_e218_4c34_adcf_84c4d7dea325.slice/crio-798f9471bf285e081a39d7d4a4863afb01f0865b0a28bc77c365e3e88257b878 WatchSource:0}: Error finding container 798f9471bf285e081a39d7d4a4863afb01f0865b0a28bc77c365e3e88257b878: Status 404 returned error can't find the container with id 798f9471bf285e081a39d7d4a4863afb01f0865b0a28bc77c365e3e88257b878
Mar 19 09:27:16.858537 master-0 kubenswrapper[15202]: I0319 09:27:16.858338 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-proxy-ca-bundles\") pod \"e58349d4-1322-4ebe-a513-146773f77a4b\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") "
Mar 19 09:27:16.858537 master-0 kubenswrapper[15202]: I0319 09:27:16.858396 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-client-ca\") pod \"e58349d4-1322-4ebe-a513-146773f77a4b\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") "
Mar 19 09:27:16.858537 master-0 kubenswrapper[15202]: I0319 09:27:16.858424 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58349d4-1322-4ebe-a513-146773f77a4b-serving-cert\") pod \"e58349d4-1322-4ebe-a513-146773f77a4b\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") "
Mar 19 09:27:16.858537 master-0 kubenswrapper[15202]: I0319 09:27:16.858480 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca\") pod \"80dad6a1-700f-4953-88e2-edc17468af14\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") "
Mar 19 09:27:16.858537 master-0 kubenswrapper[15202]: I0319 09:27:16.858502 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxjgf\" (UniqueName: \"kubernetes.io/projected/e58349d4-1322-4ebe-a513-146773f77a4b-kube-api-access-bxjgf\") pod \"e58349d4-1322-4ebe-a513-146773f77a4b\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") "
Mar 19 09:27:16.858537 master-0 kubenswrapper[15202]: I0319 09:27:16.858532 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80dad6a1-700f-4953-88e2-edc17468af14-serving-cert\") pod \"80dad6a1-700f-4953-88e2-edc17468af14\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") "
Mar 19 09:27:16.858867 master-0 kubenswrapper[15202]: I0319 09:27:16.858586 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcv9p\" (UniqueName: \"kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p\") pod \"80dad6a1-700f-4953-88e2-edc17468af14\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") "
Mar 19 09:27:16.858867 master-0 kubenswrapper[15202]: I0319 09:27:16.858621 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-config\") pod \"80dad6a1-700f-4953-88e2-edc17468af14\" (UID: \"80dad6a1-700f-4953-88e2-edc17468af14\") "
Mar 19 09:27:16.858867 master-0 kubenswrapper[15202]: I0319 09:27:16.858680 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-config\") pod \"e58349d4-1322-4ebe-a513-146773f77a4b\" (UID: \"e58349d4-1322-4ebe-a513-146773f77a4b\") "
Mar 19 09:27:16.860298 master-0 kubenswrapper[15202]: I0319 09:27:16.859299 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e58349d4-1322-4ebe-a513-146773f77a4b" (UID: "e58349d4-1322-4ebe-a513-146773f77a4b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:27:16.860298 master-0 kubenswrapper[15202]: I0319 09:27:16.859575 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca" (OuterVolumeSpecName: "client-ca") pod "80dad6a1-700f-4953-88e2-edc17468af14" (UID: "80dad6a1-700f-4953-88e2-edc17468af14"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:27:16.860298 master-0 kubenswrapper[15202]: I0319 09:27:16.859682 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-config" (OuterVolumeSpecName: "config") pod "80dad6a1-700f-4953-88e2-edc17468af14" (UID: "80dad6a1-700f-4953-88e2-edc17468af14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:27:16.860298 master-0 kubenswrapper[15202]: I0319 09:27:16.860003 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e58349d4-1322-4ebe-a513-146773f77a4b" (UID: "e58349d4-1322-4ebe-a513-146773f77a4b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:27:16.860298 master-0 kubenswrapper[15202]: I0319 09:27:16.860134 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-config" (OuterVolumeSpecName: "config") pod "e58349d4-1322-4ebe-a513-146773f77a4b" (UID: "e58349d4-1322-4ebe-a513-146773f77a4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:27:16.861252 master-0 kubenswrapper[15202]: I0319 09:27:16.861219 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58349d4-1322-4ebe-a513-146773f77a4b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e58349d4-1322-4ebe-a513-146773f77a4b" (UID: "e58349d4-1322-4ebe-a513-146773f77a4b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:27:16.862016 master-0 kubenswrapper[15202]: I0319 09:27:16.861978 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p" (OuterVolumeSpecName: "kube-api-access-bcv9p") pod "80dad6a1-700f-4953-88e2-edc17468af14" (UID: "80dad6a1-700f-4953-88e2-edc17468af14"). InnerVolumeSpecName "kube-api-access-bcv9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:27:16.862649 master-0 kubenswrapper[15202]: I0319 09:27:16.862613 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80dad6a1-700f-4953-88e2-edc17468af14-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "80dad6a1-700f-4953-88e2-edc17468af14" (UID: "80dad6a1-700f-4953-88e2-edc17468af14"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:27:16.871499 master-0 kubenswrapper[15202]: I0319 09:27:16.869171 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58349d4-1322-4ebe-a513-146773f77a4b-kube-api-access-bxjgf" (OuterVolumeSpecName: "kube-api-access-bxjgf") pod "e58349d4-1322-4ebe-a513-146773f77a4b" (UID: "e58349d4-1322-4ebe-a513-146773f77a4b"). InnerVolumeSpecName "kube-api-access-bxjgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:27:16.919362 master-0 kubenswrapper[15202]: I0319 09:27:16.919277 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc" event={"ID":"ff4eea3a-e218-4c34-adcf-84c4d7dea325","Type":"ContainerStarted","Data":"798f9471bf285e081a39d7d4a4863afb01f0865b0a28bc77c365e3e88257b878"}
Mar 19 09:27:16.922079 master-0 kubenswrapper[15202]: I0319 09:27:16.921172 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" event={"ID":"90ebca14-2ef4-4875-a682-48d7cc6fdd63","Type":"ContainerStarted","Data":"7078c14c563b909019b62640e7ba5172d0f6215601caf1c7e91252093e64af70"}
Mar 19 09:27:16.922079 master-0 kubenswrapper[15202]: I0319 09:27:16.921223 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" event={"ID":"90ebca14-2ef4-4875-a682-48d7cc6fdd63","Type":"ContainerStarted","Data":"d62425682a57be7e749aa1887b491c63a1f2a4616ad12f8840c692cf9441978d"}
Mar 19 09:27:16.923799 master-0 kubenswrapper[15202]: I0319 09:27:16.923756 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca"}
Mar 19 09:27:16.925051 master-0 kubenswrapper[15202]: I0319 09:27:16.925023 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" event={"ID":"752fcbfa-1386-4b68-ac42-5ace89d63908","Type":"ContainerStarted","Data":"e8ecb5005bbf3e8676e21e25d8a2a73b25f1913607c6949dabf8e0d5dacef6bd"}
Mar 19 09:27:16.926636 master-0 kubenswrapper[15202]: I0319 09:27:16.926594 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"5f05a032df9404bc73eb3cc46e1b9152660d4d6d0bf0e552669a8891e3609dd8"}
Mar 19 09:27:16.928671 master-0 kubenswrapper[15202]: I0319 09:27:16.928591 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" event={"ID":"feb06c2f-79d5-4c1d-a8da-8db82de9b2f9","Type":"ContainerStarted","Data":"82f59b70b3750faf38c3032a4b4f04b2605043bb8c7cfc0130d0c3580967148f"}
Mar 19 09:27:16.931743 master-0 kubenswrapper[15202]: I0319 09:27:16.930659 15202 generic.go:334] "Generic (PLEG): container finished" podID="e58349d4-1322-4ebe-a513-146773f77a4b" containerID="ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730" exitCode=0
Mar 19 09:27:16.931743 master-0 kubenswrapper[15202]: I0319 09:27:16.930686 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw"
Mar 19 09:27:16.931743 master-0 kubenswrapper[15202]: I0319 09:27:16.930752 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" event={"ID":"e58349d4-1322-4ebe-a513-146773f77a4b","Type":"ContainerDied","Data":"ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730"}
Mar 19 09:27:16.931743 master-0 kubenswrapper[15202]: I0319 09:27:16.930788 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58fff6b545-fvbrw" event={"ID":"e58349d4-1322-4ebe-a513-146773f77a4b","Type":"ContainerDied","Data":"76323448a57e2c6d1dff9d86c363961b405e92b79f3ac8a4ce1eab048e4ba96a"}
Mar 19 09:27:16.931743 master-0 kubenswrapper[15202]: I0319 09:27:16.930813 15202 scope.go:117] "RemoveContainer" containerID="ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730"
Mar 19 09:27:16.933014 master-0 kubenswrapper[15202]: I0319 09:27:16.932537 15202 generic.go:334] "Generic (PLEG): container finished" podID="80dad6a1-700f-4953-88e2-edc17468af14" containerID="8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407" exitCode=0
Mar 19 09:27:16.933014 master-0 kubenswrapper[15202]: I0319 09:27:16.932570 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" event={"ID":"80dad6a1-700f-4953-88e2-edc17468af14","Type":"ContainerDied","Data":"8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407"}
Mar 19 09:27:16.933014 master-0 kubenswrapper[15202]: I0319 09:27:16.932591 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd" event={"ID":"80dad6a1-700f-4953-88e2-edc17468af14","Type":"ContainerDied","Data":"339c92f1f2459dd07c13ff3b29aa31e6d5f62e666e189d4e9fff298b2a7e288f"}
Mar 19 09:27:16.933014 master-0 kubenswrapper[15202]: I0319 09:27:16.932701 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"
Mar 19 09:27:16.944625 master-0 kubenswrapper[15202]: I0319 09:27:16.944246 15202 scope.go:117] "RemoveContainer" containerID="ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730"
Mar 19 09:27:16.945715 master-0 kubenswrapper[15202]: E0319 09:27:16.945686 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730\": container with ID starting with ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730 not found: ID does not exist" containerID="ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730"
Mar 19 09:27:16.945865 master-0 kubenswrapper[15202]: I0319 09:27:16.945720 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730"} err="failed to get container status \"ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730\": rpc error: code = NotFound desc = could not find container \"ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730\": container with ID starting with ae1b58b7865bf5bc61280ea4c10032fe90135e8505628b5ac2bb5e64db21c730 not found: ID does not exist"
Mar 19 09:27:16.945865 master-0 kubenswrapper[15202]: I0319 09:27:16.945742 15202 scope.go:117] "RemoveContainer" containerID="8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407"
Mar 19 09:27:16.960957 master-0 kubenswrapper[15202]: I0319 09:27:16.960913 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2x8\" (UniqueName: \"kubernetes.io/projected/0f3617ef-6143-4fb4-8c84-90ce9c6be531-kube-api-access-wr2x8\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:16.961065 master-0 kubenswrapper[15202]: I0319 09:27:16.960975 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3617ef-6143-4fb4-8c84-90ce9c6be531-serving-cert\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:16.961065 master-0 kubenswrapper[15202]: I0319 09:27:16.960992 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-client-ca\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:16.961182 master-0 kubenswrapper[15202]: I0319 09:27:16.961137 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-proxy-ca-bundles\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:16.961704 master-0 kubenswrapper[15202]: I0319 09:27:16.961681 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-config\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:16.962032 master-0 kubenswrapper[15202]: I0319 09:27:16.962009 15202 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962101 master-0 kubenswrapper[15202]: I0319 09:27:16.962036 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962101 master-0 kubenswrapper[15202]: I0319 09:27:16.962045 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58349d4-1322-4ebe-a513-146773f77a4b-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962101 master-0 kubenswrapper[15202]: I0319 09:27:16.962054 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962101 master-0 kubenswrapper[15202]: I0319 09:27:16.962064 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxjgf\" (UniqueName: \"kubernetes.io/projected/e58349d4-1322-4ebe-a513-146773f77a4b-kube-api-access-bxjgf\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962101 master-0 kubenswrapper[15202]: I0319 09:27:16.962073 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80dad6a1-700f-4953-88e2-edc17468af14-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962292 master-0 kubenswrapper[15202]: I0319 09:27:16.962100 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcv9p\" (UniqueName: \"kubernetes.io/projected/80dad6a1-700f-4953-88e2-edc17468af14-kube-api-access-bcv9p\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962292 master-0 kubenswrapper[15202]: I0319 09:27:16.962121 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dad6a1-700f-4953-88e2-edc17468af14-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.962292 master-0 kubenswrapper[15202]: I0319 09:27:16.962132 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58349d4-1322-4ebe-a513-146773f77a4b-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:27:16.964967 master-0 kubenswrapper[15202]: I0319 09:27:16.964939 15202 scope.go:117] "RemoveContainer" containerID="8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407"
Mar 19 09:27:16.965317 master-0 kubenswrapper[15202]: E0319 09:27:16.965288 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407\": container with ID starting with 8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407 not found: ID does not exist" containerID="8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407"
Mar 19 09:27:16.965369 master-0 kubenswrapper[15202]: I0319 09:27:16.965312 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407"} err="failed to get container status \"8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407\": rpc error: code = NotFound desc = could not find container \"8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407\": container with ID starting with 8b97c1262572b5a775ce1365bc93226ce73a4bb17260a3de075236803b98f407 not found: ID does not exist"
Mar 19 09:27:17.053643 master-0 kubenswrapper[15202]: I0319 09:27:17.050309 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"]
Mar 19 09:27:17.063097 master-0 kubenswrapper[15202]: I0319 09:27:17.063033 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-config\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.063383 master-0 kubenswrapper[15202]: I0319 09:27:17.063361 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2x8\" (UniqueName: \"kubernetes.io/projected/0f3617ef-6143-4fb4-8c84-90ce9c6be531-kube-api-access-wr2x8\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.063504 master-0 kubenswrapper[15202]: I0319 09:27:17.063457 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3617ef-6143-4fb4-8c84-90ce9c6be531-serving-cert\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.063651 master-0 kubenswrapper[15202]: I0319 09:27:17.063504 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-client-ca\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.063651 master-0 kubenswrapper[15202]: I0319 09:27:17.063580 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-proxy-ca-bundles\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.064737 master-0 kubenswrapper[15202]: I0319 09:27:17.064714 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-proxy-ca-bundles\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.068369 master-0 kubenswrapper[15202]: I0319 09:27:17.068312 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-config\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.070078 master-0 kubenswrapper[15202]: I0319 09:27:17.069600 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3617ef-6143-4fb4-8c84-90ce9c6be531-serving-cert\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.076232 master-0 kubenswrapper[15202]: I0319 09:27:17.071616 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-client-ca\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.095411 master-0 kubenswrapper[15202]: I0319 09:27:17.091233 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"]
Mar 19 09:27:17.095411 master-0 kubenswrapper[15202]: I0319 09:27:17.091759 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2x8\" (UniqueName: \"kubernetes.io/projected/0f3617ef-6143-4fb4-8c84-90ce9c6be531-kube-api-access-wr2x8\") pod \"controller-manager-67d4b5c54d-v56p6\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") " pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.095924 master-0 kubenswrapper[15202]: I0319 09:27:17.095868 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f758fb97d-qmbkd"]
Mar 19 09:27:17.113415 master-0 kubenswrapper[15202]: I0319 09:27:17.113344 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58fff6b545-fvbrw"]
Mar 19 09:27:17.117511 master-0 kubenswrapper[15202]: I0319 09:27:17.116937 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58fff6b545-fvbrw"]
Mar 19 09:27:17.127741 master-0 kubenswrapper[15202]: I0319 09:27:17.127691 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.155107 master-0 kubenswrapper[15202]: I0319 09:27:17.154682 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-gh4px" podStartSLOduration=3.768758327 podStartE2EDuration="19.154657907s" podCreationTimestamp="2026-03-19 09:26:58 +0000 UTC" firstStartedPulling="2026-03-19 09:27:00.619534606 +0000 UTC m=+138.004949442" lastFinishedPulling="2026-03-19 09:27:16.005434206 +0000 UTC m=+153.390849022" observedRunningTime="2026-03-19 09:27:17.151005424 +0000 UTC m=+154.536420260" watchObservedRunningTime="2026-03-19 09:27:17.154657907 +0000 UTC m=+154.540072723"
Mar 19 09:27:17.184211 master-0 kubenswrapper[15202]: I0319 09:27:17.184145 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7bbc969446-vjbnk" podStartSLOduration=4.959487852 podStartE2EDuration="19.184121473s" podCreationTimestamp="2026-03-19 09:26:58 +0000 UTC" firstStartedPulling="2026-03-19 09:27:00.00746239 +0000 UTC m=+137.392877206" lastFinishedPulling="2026-03-19 09:27:14.232096011 +0000 UTC m=+151.617510827" observedRunningTime="2026-03-19 09:27:17.181447734 +0000 UTC m=+154.566862550" watchObservedRunningTime="2026-03-19 09:27:17.184121473 +0000 UTC m=+154.569536289"
Mar 19 09:27:17.215342 master-0 kubenswrapper[15202]: I0319 09:27:17.215299 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc"]
Mar 19 09:27:17.596059 master-0 kubenswrapper[15202]: I0319 09:27:17.595627 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"]
Mar 19 09:27:17.612937 master-0 kubenswrapper[15202]: W0319 09:27:17.612810 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3617ef_6143_4fb4_8c84_90ce9c6be531.slice/crio-85160d2661d6ff156f0b320837a285934abdff11c671b179d0f756da855bc914 WatchSource:0}: Error finding container 85160d2661d6ff156f0b320837a285934abdff11c671b179d0f756da855bc914: Status 404 returned error can't find the container with id 85160d2661d6ff156f0b320837a285934abdff11c671b179d0f756da855bc914
Mar 19 09:27:17.948310 master-0 kubenswrapper[15202]: I0319 09:27:17.948205 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" event={"ID":"0f3617ef-6143-4fb4-8c84-90ce9c6be531","Type":"ContainerStarted","Data":"17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"}
Mar 19 09:27:17.948310 master-0 kubenswrapper[15202]: I0319 09:27:17.948274 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" event={"ID":"0f3617ef-6143-4fb4-8c84-90ce9c6be531","Type":"ContainerStarted","Data":"85160d2661d6ff156f0b320837a285934abdff11c671b179d0f756da855bc914"}
Mar 19 09:27:17.948668 master-0 kubenswrapper[15202]: I0319 09:27:17.948632 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:27:17.953251 master-0 kubenswrapper[15202]: I0319 09:27:17.952731 15202 patch_prober.go:28] interesting pod/controller-manager-67d4b5c54d-v56p6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.88:8443/healthz\": dial tcp 10.128.0.88:8443: connect: connection refused" start-of-body=
Mar 19 09:27:17.953251 master-0 kubenswrapper[15202]: I0319 09:27:17.952795 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
probeResult="failure" output="Get \"https://10.128.0.88:8443/healthz\": dial tcp 10.128.0.88:8443: connect: connection refused" Mar 19 09:27:17.954529 master-0 kubenswrapper[15202]: I0319 09:27:17.953722 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca" exitCode=0 Mar 19 09:27:17.954529 master-0 kubenswrapper[15202]: I0319 09:27:17.953793 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca"} Mar 19 09:27:17.957922 master-0 kubenswrapper[15202]: I0319 09:27:17.957885 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc" event={"ID":"15566e56-f6ea-4628-87cd-c6151735cea3","Type":"ContainerStarted","Data":"5b95525b5e4eef1e8087287c8a4acac35ca6e9b0a5c59c50de4170637e746e51"} Mar 19 09:27:18.305531 master-0 kubenswrapper[15202]: I0319 09:27:18.305361 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" podStartSLOduration=3.305335207 podStartE2EDuration="3.305335207s" podCreationTimestamp="2026-03-19 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:18.146234126 +0000 UTC m=+155.531648952" watchObservedRunningTime="2026-03-19 09:27:18.305335207 +0000 UTC m=+155.690750023" Mar 19 09:27:18.822286 master-0 kubenswrapper[15202]: I0319 09:27:18.822216 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80dad6a1-700f-4953-88e2-edc17468af14" path="/var/lib/kubelet/pods/80dad6a1-700f-4953-88e2-edc17468af14/volumes" Mar 19 09:27:18.822974 master-0 kubenswrapper[15202]: 
I0319 09:27:18.822948 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58349d4-1322-4ebe-a513-146773f77a4b" path="/var/lib/kubelet/pods/e58349d4-1322-4ebe-a513-146773f77a4b/volumes" Mar 19 09:27:18.973149 master-0 kubenswrapper[15202]: I0319 09:27:18.972738 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" Mar 19 09:27:19.265538 master-0 kubenswrapper[15202]: I0319 09:27:19.265176 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"] Mar 19 09:27:19.266544 master-0 kubenswrapper[15202]: I0319 09:27:19.266516 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.270427 master-0 kubenswrapper[15202]: I0319 09:27:19.270374 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:27:19.270721 master-0 kubenswrapper[15202]: I0319 09:27:19.270678 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:27:19.270805 master-0 kubenswrapper[15202]: I0319 09:27:19.270762 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-tgqwm" Mar 19 09:27:19.271214 master-0 kubenswrapper[15202]: I0319 09:27:19.271168 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:27:19.271388 master-0 kubenswrapper[15202]: I0319 09:27:19.271345 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:27:19.273207 master-0 kubenswrapper[15202]: I0319 09:27:19.273182 15202 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:27:19.301540 master-0 kubenswrapper[15202]: I0319 09:27:19.298779 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"] Mar 19 09:27:19.304670 master-0 kubenswrapper[15202]: I0319 09:27:19.304624 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-config\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.304791 master-0 kubenswrapper[15202]: I0319 09:27:19.304684 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdmhj\" (UniqueName: \"kubernetes.io/projected/a2f7f5e9-658c-44a4-a42a-544247b24195-kube-api-access-bdmhj\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.304791 master-0 kubenswrapper[15202]: I0319 09:27:19.304749 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f7f5e9-658c-44a4-a42a-544247b24195-serving-cert\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.304791 master-0 kubenswrapper[15202]: I0319 09:27:19.304767 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-client-ca\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.406667 master-0 kubenswrapper[15202]: I0319 09:27:19.406581 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f7f5e9-658c-44a4-a42a-544247b24195-serving-cert\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.406667 master-0 kubenswrapper[15202]: I0319 09:27:19.406663 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-client-ca\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.406969 master-0 kubenswrapper[15202]: I0319 09:27:19.406725 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-config\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.406969 master-0 kubenswrapper[15202]: I0319 09:27:19.406779 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdmhj\" (UniqueName: \"kubernetes.io/projected/a2f7f5e9-658c-44a4-a42a-544247b24195-kube-api-access-bdmhj\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " 
pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.408913 master-0 kubenswrapper[15202]: I0319 09:27:19.408753 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-client-ca\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.409413 master-0 kubenswrapper[15202]: I0319 09:27:19.409308 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-config\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.422646 master-0 kubenswrapper[15202]: I0319 09:27:19.422601 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f7f5e9-658c-44a4-a42a-544247b24195-serving-cert\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.427529 master-0 kubenswrapper[15202]: I0319 09:27:19.427489 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdmhj\" (UniqueName: \"kubernetes.io/projected/a2f7f5e9-658c-44a4-a42a-544247b24195-kube-api-access-bdmhj\") pod \"route-controller-manager-598f995956-qbmvv\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") " pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.587518 master-0 kubenswrapper[15202]: I0319 09:27:19.587304 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:19.812122 master-0 kubenswrapper[15202]: I0319 09:27:19.812052 15202 scope.go:117] "RemoveContainer" containerID="c3296f5ca353368f20bb7becc372b39ea3d6cff0940a0d1437ce89c351a61bf7" Mar 19 09:27:21.426534 master-0 kubenswrapper[15202]: I0319 09:27:21.426381 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"] Mar 19 09:27:21.465208 master-0 kubenswrapper[15202]: I0319 09:27:21.465168 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:27:21.474698 master-0 kubenswrapper[15202]: I0319 09:27:21.474494 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.482243 master-0 kubenswrapper[15202]: I0319 09:27:21.482187 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 09:27:21.482482 master-0 kubenswrapper[15202]: I0319 09:27:21.482402 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 09:27:21.482935 master-0 kubenswrapper[15202]: I0319 09:27:21.482878 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 09:27:21.483847 master-0 kubenswrapper[15202]: I0319 09:27:21.483002 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4i3vpe46p0rrq" Mar 19 09:27:21.483847 master-0 kubenswrapper[15202]: I0319 09:27:21.483164 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 19 09:27:21.483847 master-0 kubenswrapper[15202]: I0319 09:27:21.483273 15202 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 09:27:21.483847 master-0 kubenswrapper[15202]: I0319 09:27:21.483486 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 09:27:21.483847 master-0 kubenswrapper[15202]: I0319 09:27:21.483623 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 09:27:21.484663 master-0 kubenswrapper[15202]: I0319 09:27:21.484233 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 09:27:21.484663 master-0 kubenswrapper[15202]: I0319 09:27:21.484348 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 09:27:21.493020 master-0 kubenswrapper[15202]: I0319 09:27:21.492909 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 09:27:21.495173 master-0 kubenswrapper[15202]: I0319 09:27:21.494347 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 09:27:21.526550 master-0 kubenswrapper[15202]: I0319 09:27:21.526383 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:27:21.651773 master-0 kubenswrapper[15202]: I0319 09:27:21.651676 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652024 master-0 kubenswrapper[15202]: I0319 09:27:21.651798 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652024 master-0 kubenswrapper[15202]: I0319 09:27:21.651840 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652024 master-0 kubenswrapper[15202]: I0319 09:27:21.651873 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652024 master-0 kubenswrapper[15202]: I0319 09:27:21.651907 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652024 master-0 kubenswrapper[15202]: I0319 09:27:21.651956 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652024 master-0 kubenswrapper[15202]: I0319 09:27:21.651997 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652285 master-0 kubenswrapper[15202]: I0319 09:27:21.652063 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652285 master-0 kubenswrapper[15202]: I0319 09:27:21.652165 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-web-config\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652344 master-0 kubenswrapper[15202]: I0319 09:27:21.652293 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652404 master-0 kubenswrapper[15202]: I0319 09:27:21.652380 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-config-out\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652480 master-0 kubenswrapper[15202]: I0319 09:27:21.652441 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652529 master-0 kubenswrapper[15202]: I0319 09:27:21.652490 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652560 master-0 kubenswrapper[15202]: I0319 09:27:21.652539 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652595 master-0 kubenswrapper[15202]: I0319 09:27:21.652564 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6t4k\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-kube-api-access-l6t4k\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652657 master-0 kubenswrapper[15202]: I0319 09:27:21.652596 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652714 master-0 kubenswrapper[15202]: I0319 09:27:21.652692 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-config\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.652775 master-0 kubenswrapper[15202]: I0319 09:27:21.652755 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755023 master-0 kubenswrapper[15202]: I0319 09:27:21.754937 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-config\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755427 master-0 kubenswrapper[15202]: I0319 09:27:21.755398 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755532 master-0 kubenswrapper[15202]: I0319 09:27:21.755442 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755599 master-0 kubenswrapper[15202]: I0319 09:27:21.755528 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755599 master-0 kubenswrapper[15202]: I0319 09:27:21.755558 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755599 master-0 kubenswrapper[15202]: I0319 09:27:21.755581 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755716 master-0 kubenswrapper[15202]: I0319 09:27:21.755616 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-metrics-client-ca\") pod 
\"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755716 master-0 kubenswrapper[15202]: I0319 09:27:21.755695 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755784 master-0 kubenswrapper[15202]: I0319 09:27:21.755729 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755784 master-0 kubenswrapper[15202]: I0319 09:27:21.755751 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755784 master-0 kubenswrapper[15202]: I0319 09:27:21.755781 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-web-config\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755877 master-0 kubenswrapper[15202]: I0319 09:27:21.755847 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755914 master-0 kubenswrapper[15202]: I0319 09:27:21.755893 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-config-out\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.755947 master-0 kubenswrapper[15202]: I0319 09:27:21.755927 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.756011 master-0 kubenswrapper[15202]: I0319 09:27:21.755957 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.756055 master-0 kubenswrapper[15202]: I0319 09:27:21.756029 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.756090 master-0 kubenswrapper[15202]: I0319 09:27:21.756059 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l6t4k\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-kube-api-access-l6t4k\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.756090 master-0 kubenswrapper[15202]: I0319 09:27:21.756085 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.757494 master-0 kubenswrapper[15202]: I0319 09:27:21.757407 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.758439 master-0 kubenswrapper[15202]: I0319 09:27:21.758352 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.758991 master-0 kubenswrapper[15202]: I0319 09:27:21.758932 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.760370 master-0 kubenswrapper[15202]: I0319 09:27:21.760334 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.761534 master-0 kubenswrapper[15202]: I0319 09:27:21.760998 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.761534 master-0 kubenswrapper[15202]: I0319 09:27:21.761374 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.761874 master-0 kubenswrapper[15202]: I0319 09:27:21.761820 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-web-config\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.762150 master-0 kubenswrapper[15202]: I0319 09:27:21.762120 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 
09:27:21.762540 master-0 kubenswrapper[15202]: I0319 09:27:21.762489 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.763175 master-0 kubenswrapper[15202]: I0319 09:27:21.763152 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-config\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.764010 master-0 kubenswrapper[15202]: I0319 09:27:21.763787 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.766954 master-0 kubenswrapper[15202]: I0319 09:27:21.766929 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.769586 master-0 kubenswrapper[15202]: I0319 09:27:21.767713 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.769586 master-0 kubenswrapper[15202]: I0319 
09:27:21.768306 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.774205 master-0 kubenswrapper[15202]: I0319 09:27:21.773020 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-config-out\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.774205 master-0 kubenswrapper[15202]: I0319 09:27:21.773409 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.774205 master-0 kubenswrapper[15202]: I0319 09:27:21.774158 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.791334 master-0 kubenswrapper[15202]: I0319 09:27:21.791143 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6t4k\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-kube-api-access-l6t4k\") pod \"prometheus-k8s-0\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:21.858640 master-0 kubenswrapper[15202]: I0319 09:27:21.858441 15202 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:27:22.051721 master-0 kubenswrapper[15202]: I0319 09:27:22.050733 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc" event={"ID":"ff4eea3a-e218-4c34-adcf-84c4d7dea325","Type":"ContainerStarted","Data":"c9450b79dcad3a501672c18834e660e5626eb3f7839e026ba4190ab7195377a0"} Mar 19 09:27:22.060364 master-0 kubenswrapper[15202]: I0319 09:27:22.057902 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" event={"ID":"a2f7f5e9-658c-44a4-a42a-544247b24195","Type":"ContainerStarted","Data":"724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676"} Mar 19 09:27:22.060364 master-0 kubenswrapper[15202]: I0319 09:27:22.057957 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" event={"ID":"a2f7f5e9-658c-44a4-a42a-544247b24195","Type":"ContainerStarted","Data":"c8019030abd57d8f8a1e32054a156375adf80661f1587a2296060eba25b966e1"} Mar 19 09:27:22.060364 master-0 kubenswrapper[15202]: I0319 09:27:22.059131 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:22.060945 master-0 kubenswrapper[15202]: I0319 09:27:22.060807 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" event={"ID":"752fcbfa-1386-4b68-ac42-5ace89d63908","Type":"ContainerStarted","Data":"19fb792698c84bf8a7df379b8032ae7c8986ae0eaf4ceed60324dc086a74ac5b"} Mar 19 09:27:22.063201 master-0 kubenswrapper[15202]: I0319 09:27:22.063132 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" 
event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"4d7576b26e0c5575de73859c8f0818ad5fe13524d8320cd3cea6f3b5fd265473"} Mar 19 09:27:22.063201 master-0 kubenswrapper[15202]: I0319 09:27:22.063156 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"34d1c77b35968c70d7a0a898c0ac71f5213c64f42de087a4be7e87d241b3b499"} Mar 19 09:27:22.063201 master-0 kubenswrapper[15202]: I0319 09:27:22.063170 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"7adc264194ffb35cce27c7c2708cdc1b80e5637a2c9492b34ac1e8a44ebac11f"} Mar 19 09:27:22.065754 master-0 kubenswrapper[15202]: I0319 09:27:22.064188 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc" event={"ID":"15566e56-f6ea-4628-87cd-c6151735cea3","Type":"ContainerStarted","Data":"360b49df34fae108aae204de0dbc1ea3a7ef1405f4efecb0834d8cd04b1a2571"} Mar 19 09:27:22.065754 master-0 kubenswrapper[15202]: I0319 09:27:22.064880 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc" Mar 19 09:27:22.067947 master-0 kubenswrapper[15202]: I0319 09:27:22.066561 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/3.log" Mar 19 09:27:22.067947 master-0 kubenswrapper[15202]: I0319 09:27:22.066593 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-grltt" 
event={"ID":"269465d8-91d6-40d7-bfde-3eff9b93c1cf","Type":"ContainerStarted","Data":"846bf6940ce9431f2d82db349049b45162724831f1bf49f0d1f86397feffa4c3"} Mar 19 09:27:22.067947 master-0 kubenswrapper[15202]: I0319 09:27:22.067370 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:27:22.071651 master-0 kubenswrapper[15202]: I0319 09:27:22.071614 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc" Mar 19 09:27:22.163697 master-0 kubenswrapper[15202]: I0319 09:27:22.163612 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c6b76c555-dwqmc" podStartSLOduration=3.104815444 podStartE2EDuration="7.163495917s" podCreationTimestamp="2026-03-19 09:27:15 +0000 UTC" firstStartedPulling="2026-03-19 09:27:16.807590407 +0000 UTC m=+154.193005213" lastFinishedPulling="2026-03-19 09:27:20.86627087 +0000 UTC m=+158.251685686" observedRunningTime="2026-03-19 09:27:22.146213123 +0000 UTC m=+159.531627959" watchObservedRunningTime="2026-03-19 09:27:22.163495917 +0000 UTC m=+159.548910733" Mar 19 09:27:22.225265 master-0 kubenswrapper[15202]: I0319 09:27:22.225174 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5d7d9df6f8-qwngc" podStartSLOduration=2.579018125 podStartE2EDuration="6.225153108s" podCreationTimestamp="2026-03-19 09:27:16 +0000 UTC" firstStartedPulling="2026-03-19 09:27:17.226593453 +0000 UTC m=+154.612008269" lastFinishedPulling="2026-03-19 09:27:20.872728436 +0000 UTC m=+158.258143252" observedRunningTime="2026-03-19 09:27:22.21903661 +0000 UTC m=+159.604451426" watchObservedRunningTime="2026-03-19 09:27:22.225153108 +0000 UTC m=+159.610567924" Mar 19 09:27:22.230414 master-0 kubenswrapper[15202]: I0319 09:27:22.230373 15202 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b6568d85-grltt" Mar 19 09:27:22.262887 master-0 kubenswrapper[15202]: I0319 09:27:22.262745 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" podStartSLOduration=3.072914286 podStartE2EDuration="7.262728902s" podCreationTimestamp="2026-03-19 09:27:15 +0000 UTC" firstStartedPulling="2026-03-19 09:27:16.679966444 +0000 UTC m=+154.065381260" lastFinishedPulling="2026-03-19 09:27:20.86978106 +0000 UTC m=+158.255195876" observedRunningTime="2026-03-19 09:27:22.260977256 +0000 UTC m=+159.646392072" watchObservedRunningTime="2026-03-19 09:27:22.262728902 +0000 UTC m=+159.648143718" Mar 19 09:27:22.379498 master-0 kubenswrapper[15202]: I0319 09:27:22.378669 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b6568d85-grltt" podStartSLOduration=96.125039559 podStartE2EDuration="1m38.378649014s" podCreationTimestamp="2026-03-19 09:25:44 +0000 UTC" firstStartedPulling="2026-03-19 09:25:44.997370566 +0000 UTC m=+62.382785382" lastFinishedPulling="2026-03-19 09:25:47.250980021 +0000 UTC m=+64.636394837" observedRunningTime="2026-03-19 09:27:22.375922964 +0000 UTC m=+159.761337780" watchObservedRunningTime="2026-03-19 09:27:22.378649014 +0000 UTC m=+159.764063830" Mar 19 09:27:22.388919 master-0 kubenswrapper[15202]: I0319 09:27:22.388796 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:27:22.438884 master-0 kubenswrapper[15202]: I0319 09:27:22.438813 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" Mar 19 09:27:22.581637 master-0 kubenswrapper[15202]: I0319 09:27:22.574590 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" podStartSLOduration=7.574556508 podStartE2EDuration="7.574556508s" podCreationTimestamp="2026-03-19 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:22.414550675 +0000 UTC m=+159.799965491" watchObservedRunningTime="2026-03-19 09:27:22.574556508 +0000 UTC m=+159.959971324" Mar 19 09:27:22.734154 master-0 kubenswrapper[15202]: I0319 09:27:22.734059 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" podUID="c656b2f4-785b-4403-b56a-637656900f07" containerName="oauth-openshift" containerID="cri-o://618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00" gracePeriod=15 Mar 19 09:27:22.855629 master-0 kubenswrapper[15202]: I0319 09:27:22.855564 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-66b8ffb895-7n68q"] Mar 19 09:27:22.856711 master-0 kubenswrapper[15202]: I0319 09:27:22.856551 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:27:22.860083 master-0 kubenswrapper[15202]: I0319 09:27:22.860059 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 09:27:22.860307 master-0 kubenswrapper[15202]: I0319 09:27:22.860290 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 09:27:22.880309 master-0 kubenswrapper[15202]: I0319 09:27:22.880228 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-7n68q"] Mar 19 09:27:22.971580 master-0 kubenswrapper[15202]: I0319 09:27:22.971426 15202 patch_prober.go:28] interesting pod/oauth-openshift-967b7967b-mb725 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.80:6443/healthz\": dial tcp 10.128.0.80:6443: connect: connection refused" start-of-body= Mar 19 09:27:22.972596 master-0 kubenswrapper[15202]: I0319 09:27:22.971611 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" podUID="c656b2f4-785b-4403-b56a-637656900f07" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.80:6443/healthz\": dial tcp 10.128.0.80:6443: connect: connection refused" Mar 19 09:27:22.991165 master-0 kubenswrapper[15202]: I0319 09:27:22.991074 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5x6k\" (UniqueName: \"kubernetes.io/projected/1dc7476c-75a8-40fe-93f7-fca31aa2ebda-kube-api-access-r5x6k\") pod \"downloads-66b8ffb895-7n68q\" (UID: \"1dc7476c-75a8-40fe-93f7-fca31aa2ebda\") " pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:27:23.096991 master-0 kubenswrapper[15202]: I0319 09:27:23.096836 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-r5x6k\" (UniqueName: \"kubernetes.io/projected/1dc7476c-75a8-40fe-93f7-fca31aa2ebda-kube-api-access-r5x6k\") pod \"downloads-66b8ffb895-7n68q\" (UID: \"1dc7476c-75a8-40fe-93f7-fca31aa2ebda\") " pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:27:23.117792 master-0 kubenswrapper[15202]: W0319 09:27:23.117683 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75eefc3a_d29d_499e_98fd_7292ff09c294.slice/crio-72318015826a9a6df10b0ba1b3beedcda5e582b26dc3e68735c4af6fa4b42d32 WatchSource:0}: Error finding container 72318015826a9a6df10b0ba1b3beedcda5e582b26dc3e68735c4af6fa4b42d32: Status 404 returned error can't find the container with id 72318015826a9a6df10b0ba1b3beedcda5e582b26dc3e68735c4af6fa4b42d32 Mar 19 09:27:23.162703 master-0 kubenswrapper[15202]: I0319 09:27:23.162623 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5x6k\" (UniqueName: \"kubernetes.io/projected/1dc7476c-75a8-40fe-93f7-fca31aa2ebda-kube-api-access-r5x6k\") pod \"downloads-66b8ffb895-7n68q\" (UID: \"1dc7476c-75a8-40fe-93f7-fca31aa2ebda\") " pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:27:23.239790 master-0 kubenswrapper[15202]: I0319 09:27:23.239703 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:27:23.697342 master-0 kubenswrapper[15202]: I0319 09:27:23.697260 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:27:23.767926 master-0 kubenswrapper[15202]: I0319 09:27:23.767700 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-848cd9b885-hcbh9"] Mar 19 09:27:23.768295 master-0 kubenswrapper[15202]: E0319 09:27:23.768251 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c656b2f4-785b-4403-b56a-637656900f07" containerName="oauth-openshift" Mar 19 09:27:23.768295 master-0 kubenswrapper[15202]: I0319 09:27:23.768277 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c656b2f4-785b-4403-b56a-637656900f07" containerName="oauth-openshift" Mar 19 09:27:23.768501 master-0 kubenswrapper[15202]: I0319 09:27:23.768446 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c656b2f4-785b-4403-b56a-637656900f07" containerName="oauth-openshift" Mar 19 09:27:23.769888 master-0 kubenswrapper[15202]: I0319 09:27:23.769112 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.834359 master-0 kubenswrapper[15202]: I0319 09:27:23.834280 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pphhm\" (UniqueName: \"kubernetes.io/projected/c656b2f4-785b-4403-b56a-637656900f07-kube-api-access-pphhm\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.834359 master-0 kubenswrapper[15202]: I0319 09:27:23.834357 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-serving-cert\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.834730 master-0 kubenswrapper[15202]: I0319 09:27:23.834403 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-provider-selection\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.834730 master-0 kubenswrapper[15202]: I0319 09:27:23.834536 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-cliconfig\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.836294 master-0 kubenswrapper[15202]: I0319 09:27:23.836253 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod 
"c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:23.836420 master-0 kubenswrapper[15202]: I0319 09:27:23.836399 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-ocp-branding-template\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.838283 master-0 kubenswrapper[15202]: I0319 09:27:23.838244 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-router-certs\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.838931 master-0 kubenswrapper[15202]: I0319 09:27:23.838897 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-error\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839006 master-0 kubenswrapper[15202]: I0319 09:27:23.838939 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c656b2f4-785b-4403-b56a-637656900f07-audit-dir\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839006 master-0 kubenswrapper[15202]: I0319 09:27:23.838968 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-audit-policies\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839006 master-0 kubenswrapper[15202]: I0319 09:27:23.839000 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-session\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839128 master-0 kubenswrapper[15202]: I0319 09:27:23.839047 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-trusted-ca-bundle\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839128 master-0 kubenswrapper[15202]: I0319 09:27:23.839068 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-service-ca\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839128 master-0 kubenswrapper[15202]: I0319 09:27:23.839091 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-login\") pod \"c656b2f4-785b-4403-b56a-637656900f07\" (UID: \"c656b2f4-785b-4403-b56a-637656900f07\") " Mar 19 09:27:23.839247 master-0 kubenswrapper[15202]: I0319 09:27:23.839214 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-session\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839247 master-0 kubenswrapper[15202]: I0319 09:27:23.839243 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839349 master-0 kubenswrapper[15202]: I0319 09:27:23.839266 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qptcn\" (UniqueName: \"kubernetes.io/projected/2ca4075f-8b54-49da-a2b8-6a22801f6607-kube-api-access-qptcn\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839349 master-0 kubenswrapper[15202]: I0319 09:27:23.839320 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-serving-cert\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839349 master-0 kubenswrapper[15202]: I0319 09:27:23.839338 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-error\") pod 
\"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839486 master-0 kubenswrapper[15202]: I0319 09:27:23.839402 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839486 master-0 kubenswrapper[15202]: I0319 09:27:23.839423 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-cliconfig\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839486 master-0 kubenswrapper[15202]: I0319 09:27:23.839453 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-dir\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839594 master-0 kubenswrapper[15202]: I0319 09:27:23.839512 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " 
pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839594 master-0 kubenswrapper[15202]: I0319 09:27:23.839538 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-router-certs\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839594 master-0 kubenswrapper[15202]: I0319 09:27:23.839565 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-service-ca\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839594 master-0 kubenswrapper[15202]: I0319 09:27:23.839590 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-policies\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839707 master-0 kubenswrapper[15202]: I0319 09:27:23.839608 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-login\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.839707 master-0 kubenswrapper[15202]: I0319 
09:27:23.839675 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.840661 master-0 kubenswrapper[15202]: I0319 09:27:23.840434 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c656b2f4-785b-4403-b56a-637656900f07-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:27:23.840661 master-0 kubenswrapper[15202]: I0319 09:27:23.840550 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:23.841044 master-0 kubenswrapper[15202]: I0319 09:27:23.841006 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:23.841344 master-0 kubenswrapper[15202]: I0319 09:27:23.841300 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:23.842531 master-0 kubenswrapper[15202]: I0319 09:27:23.842455 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.842625 master-0 kubenswrapper[15202]: I0319 09:27:23.842585 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-848cd9b885-hcbh9"] Mar 19 09:27:23.842757 master-0 kubenswrapper[15202]: I0319 09:27:23.842713 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.842982 master-0 kubenswrapper[15202]: I0319 09:27:23.842950 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.843328 master-0 kubenswrapper[15202]: I0319 09:27:23.843256 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.844505 master-0 kubenswrapper[15202]: I0319 09:27:23.844426 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.855751 master-0 kubenswrapper[15202]: I0319 09:27:23.845489 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.855751 master-0 kubenswrapper[15202]: I0319 09:27:23.845989 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c656b2f4-785b-4403-b56a-637656900f07-kube-api-access-pphhm" (OuterVolumeSpecName: "kube-api-access-pphhm") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "kube-api-access-pphhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:23.855751 master-0 kubenswrapper[15202]: I0319 09:27:23.851097 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c656b2f4-785b-4403-b56a-637656900f07" (UID: "c656b2f4-785b-4403-b56a-637656900f07"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:23.885874 master-0 kubenswrapper[15202]: I0319 09:27:23.885672 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-7n68q"] Mar 19 09:27:23.941460 master-0 kubenswrapper[15202]: I0319 09:27:23.941263 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-session\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.941460 master-0 kubenswrapper[15202]: I0319 09:27:23.941339 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.947555 master-0 kubenswrapper[15202]: I0319 09:27:23.947461 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qptcn\" (UniqueName: \"kubernetes.io/projected/2ca4075f-8b54-49da-a2b8-6a22801f6607-kube-api-access-qptcn\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.947801 master-0 kubenswrapper[15202]: I0319 09:27:23.947770 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-serving-cert\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " 
pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.947867 master-0 kubenswrapper[15202]: I0319 09:27:23.947827 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-error\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948011 master-0 kubenswrapper[15202]: I0319 09:27:23.947949 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948072 master-0 kubenswrapper[15202]: I0319 09:27:23.948038 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948122 master-0 kubenswrapper[15202]: I0319 09:27:23.948110 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-cliconfig\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948210 master-0 kubenswrapper[15202]: I0319 09:27:23.948196 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-dir\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948331 master-0 kubenswrapper[15202]: I0319 09:27:23.948309 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948411 master-0 kubenswrapper[15202]: I0319 09:27:23.948361 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-router-certs\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.948497 master-0 kubenswrapper[15202]: I0319 09:27:23.948426 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-service-ca\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.949971 master-0 kubenswrapper[15202]: I0319 09:27:23.949882 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-policies\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.949971 master-0 kubenswrapper[15202]: I0319 09:27:23.949925 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-login\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.950390 master-0 kubenswrapper[15202]: I0319 09:27:23.950234 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950390 master-0 kubenswrapper[15202]: I0319 09:27:23.950284 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pphhm\" (UniqueName: \"kubernetes.io/projected/c656b2f4-785b-4403-b56a-637656900f07-kube-api-access-pphhm\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950390 master-0 kubenswrapper[15202]: I0319 09:27:23.950302 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950390 master-0 kubenswrapper[15202]: I0319 09:27:23.950327 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950390 
master-0 kubenswrapper[15202]: I0319 09:27:23.950344 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950390 master-0 kubenswrapper[15202]: I0319 09:27:23.950368 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950390 master-0 kubenswrapper[15202]: I0319 09:27:23.950385 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950933 master-0 kubenswrapper[15202]: I0319 09:27:23.950404 15202 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c656b2f4-785b-4403-b56a-637656900f07-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950933 master-0 kubenswrapper[15202]: I0319 09:27:23.950424 15202 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950933 master-0 kubenswrapper[15202]: I0319 09:27:23.950445 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950933 master-0 kubenswrapper[15202]: I0319 09:27:23.950542 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950933 master-0 kubenswrapper[15202]: I0319 09:27:23.950564 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c656b2f4-785b-4403-b56a-637656900f07-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:27:23.950933 master-0 kubenswrapper[15202]: I0319 09:27:23.950717 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-serving-cert\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.951420 master-0 kubenswrapper[15202]: I0319 09:27:23.951370 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-cliconfig\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.951572 master-0 kubenswrapper[15202]: I0319 09:27:23.951456 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-service-ca\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.951572 master-0 kubenswrapper[15202]: I0319 09:27:23.951568 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-dir\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.961161 master-0 kubenswrapper[15202]: I0319 09:27:23.954289 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-policies\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.961161 master-0 kubenswrapper[15202]: I0319 09:27:23.956872 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-error\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.961161 master-0 kubenswrapper[15202]: I0319 09:27:23.957179 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.961161 master-0 kubenswrapper[15202]: I0319 09:27:23.958967 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " 
pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.968439 master-0 kubenswrapper[15202]: I0319 09:27:23.966766 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-router-certs\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.968439 master-0 kubenswrapper[15202]: I0319 09:27:23.967601 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-session\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.970648 master-0 kubenswrapper[15202]: I0319 09:27:23.970290 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-login\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:23.972155 master-0 kubenswrapper[15202]: I0319 09:27:23.971744 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qptcn\" (UniqueName: \"kubernetes.io/projected/2ca4075f-8b54-49da-a2b8-6a22801f6607-kube-api-access-qptcn\") pod \"oauth-openshift-848cd9b885-hcbh9\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:24.087765 master-0 kubenswrapper[15202]: I0319 09:27:24.087694 15202 generic.go:334] "Generic (PLEG): container finished" 
podID="c656b2f4-785b-4403-b56a-637656900f07" containerID="618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00" exitCode=0 Mar 19 09:27:24.087989 master-0 kubenswrapper[15202]: I0319 09:27:24.087775 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" event={"ID":"c656b2f4-785b-4403-b56a-637656900f07","Type":"ContainerDied","Data":"618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00"} Mar 19 09:27:24.087989 master-0 kubenswrapper[15202]: I0319 09:27:24.087807 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" event={"ID":"c656b2f4-785b-4403-b56a-637656900f07","Type":"ContainerDied","Data":"525b8c67ee6f67dd3a96af91aefd1b8871e8d6e92ee92d0c0fe811c2fe8afb8a"} Mar 19 09:27:24.087989 master-0 kubenswrapper[15202]: I0319 09:27:24.087827 15202 scope.go:117] "RemoveContainer" containerID="618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00" Mar 19 09:27:24.087989 master-0 kubenswrapper[15202]: I0319 09:27:24.087875 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-967b7967b-mb725" Mar 19 09:27:24.090214 master-0 kubenswrapper[15202]: I0319 09:27:24.090180 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-7n68q" event={"ID":"1dc7476c-75a8-40fe-93f7-fca31aa2ebda","Type":"ContainerStarted","Data":"6065bc6a9f81c34c6e6f9e03f51356c665f37159e2abc1a949b2c18c0f8b451e"} Mar 19 09:27:24.102508 master-0 kubenswrapper[15202]: I0319 09:27:24.102429 15202 generic.go:334] "Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33" exitCode=0 Mar 19 09:27:24.102883 master-0 kubenswrapper[15202]: I0319 09:27:24.102510 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33"} Mar 19 09:27:24.102883 master-0 kubenswrapper[15202]: I0319 09:27:24.102561 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"72318015826a9a6df10b0ba1b3beedcda5e582b26dc3e68735c4af6fa4b42d32"} Mar 19 09:27:24.104266 master-0 kubenswrapper[15202]: I0319 09:27:24.104241 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:24.108725 master-0 kubenswrapper[15202]: I0319 09:27:24.108683 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1"} Mar 19 09:27:25.141629 master-0 kubenswrapper[15202]: I0319 09:27:25.141401 15202 scope.go:117] "RemoveContainer" containerID="618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00" Mar 19 09:27:25.142148 master-0 kubenswrapper[15202]: E0319 09:27:25.142000 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00\": container with ID starting with 618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00 not found: ID does not exist" containerID="618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00" Mar 19 09:27:25.142148 master-0 kubenswrapper[15202]: I0319 09:27:25.142098 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00"} err="failed to get container status \"618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00\": rpc error: code = NotFound desc = could not find container \"618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00\": container with ID starting with 618e0ea6ec4e6a029f094550eb29eadd009088bf42cfe76a4177f56f50e0ce00 not found: ID does not exist" Mar 19 09:27:25.701238 master-0 kubenswrapper[15202]: I0319 09:27:25.696478 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-848cd9b885-hcbh9"] Mar 19 09:27:25.717254 master-0 kubenswrapper[15202]: W0319 09:27:25.717197 15202 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ca4075f_8b54_49da_a2b8_6a22801f6607.slice/crio-4a15f628112cd55d1202ebeef0b370f3a8aa4af37f6ffa1ca0c4349955430188 WatchSource:0}: Error finding container 4a15f628112cd55d1202ebeef0b370f3a8aa4af37f6ffa1ca0c4349955430188: Status 404 returned error can't find the container with id 4a15f628112cd55d1202ebeef0b370f3a8aa4af37f6ffa1ca0c4349955430188 Mar 19 09:27:25.780072 master-0 kubenswrapper[15202]: I0319 09:27:25.780037 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-967b7967b-mb725"] Mar 19 09:27:25.797979 master-0 kubenswrapper[15202]: I0319 09:27:25.797896 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-967b7967b-mb725"] Mar 19 09:27:26.149033 master-0 kubenswrapper[15202]: I0319 09:27:26.147953 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de"} Mar 19 09:27:26.149033 master-0 kubenswrapper[15202]: I0319 09:27:26.148005 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0"} Mar 19 09:27:26.149033 master-0 kubenswrapper[15202]: I0319 09:27:26.148015 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551"} Mar 19 09:27:26.151440 master-0 kubenswrapper[15202]: I0319 09:27:26.151402 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"59ba5767c9544bd68555bd0acf7bcb2264d3b4bda42ad0b165a1243fb45880af"} Mar 19 09:27:26.151440 master-0 kubenswrapper[15202]: I0319 09:27:26.151434 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"2678fec8a618a84413e1f7e1a6690d904fc6d728df918b3a94672c06215711c1"} Mar 19 09:27:26.180744 master-0 kubenswrapper[15202]: I0319 09:27:26.174478 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" event={"ID":"2ca4075f-8b54-49da-a2b8-6a22801f6607","Type":"ContainerStarted","Data":"13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2"} Mar 19 09:27:26.180744 master-0 kubenswrapper[15202]: I0319 09:27:26.174546 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" event={"ID":"2ca4075f-8b54-49da-a2b8-6a22801f6607","Type":"ContainerStarted","Data":"4a15f628112cd55d1202ebeef0b370f3a8aa4af37f6ffa1ca0c4349955430188"} Mar 19 09:27:26.180744 master-0 kubenswrapper[15202]: I0319 09:27:26.175149 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:26.210645 master-0 kubenswrapper[15202]: I0319 09:27:26.210439 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" podStartSLOduration=30.210413837 podStartE2EDuration="30.210413837s" podCreationTimestamp="2026-03-19 09:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:26.208527569 +0000 UTC m=+163.593942385" 
watchObservedRunningTime="2026-03-19 09:27:26.210413837 +0000 UTC m=+163.595828663" Mar 19 09:27:26.421167 master-0 kubenswrapper[15202]: I0319 09:27:26.421092 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:27:26.824300 master-0 kubenswrapper[15202]: I0319 09:27:26.824131 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c656b2f4-785b-4403-b56a-637656900f07" path="/var/lib/kubelet/pods/c656b2f4-785b-4403-b56a-637656900f07/volumes" Mar 19 09:27:27.190810 master-0 kubenswrapper[15202]: I0319 09:27:27.190719 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" event={"ID":"8e8163b1-50e4-4fa7-9324-fe74c24549de","Type":"ContainerStarted","Data":"8f9595479e71379a7ff2aec2edb1c2e3ec4b1b4f837573cd4bdcab50dcd58713"} Mar 19 09:27:27.192884 master-0 kubenswrapper[15202]: I0319 09:27:27.192314 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 09:27:27.197301 master-0 kubenswrapper[15202]: I0319 09:27:27.197224 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3"} Mar 19 09:27:27.197301 master-0 kubenswrapper[15202]: I0319 09:27:27.197281 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerStarted","Data":"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf"} Mar 19 09:27:27.198374 master-0 kubenswrapper[15202]: I0319 09:27:27.198335 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" Mar 19 
09:27:27.240492 master-0 kubenswrapper[15202]: I0319 09:27:27.230374 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7d7bcd498-w2pfb" podStartSLOduration=8.139496801 podStartE2EDuration="17.230342462s" podCreationTimestamp="2026-03-19 09:27:10 +0000 UTC" firstStartedPulling="2026-03-19 09:27:16.647294936 +0000 UTC m=+154.032709752" lastFinishedPulling="2026-03-19 09:27:25.738140597 +0000 UTC m=+163.123555413" observedRunningTime="2026-03-19 09:27:27.22398367 +0000 UTC m=+164.609398486" watchObservedRunningTime="2026-03-19 09:27:27.230342462 +0000 UTC m=+164.615757298" Mar 19 09:27:27.268396 master-0 kubenswrapper[15202]: I0319 09:27:27.268088 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=5.523227513 podStartE2EDuration="28.26806719s" podCreationTimestamp="2026-03-19 09:26:59 +0000 UTC" firstStartedPulling="2026-03-19 09:27:00.958028046 +0000 UTC m=+138.343442862" lastFinishedPulling="2026-03-19 09:27:23.702867723 +0000 UTC m=+161.088282539" observedRunningTime="2026-03-19 09:27:27.265893335 +0000 UTC m=+164.651308151" watchObservedRunningTime="2026-03-19 09:27:27.26806719 +0000 UTC m=+164.653482006" Mar 19 09:27:31.247942 master-0 kubenswrapper[15202]: I0319 09:27:31.247804 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2"} Mar 19 09:27:31.247942 master-0 kubenswrapper[15202]: I0319 09:27:31.247879 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8"} Mar 19 09:27:31.247942 master-0 kubenswrapper[15202]: I0319 
09:27:31.247895 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b"}
Mar 19 09:27:31.247942 master-0 kubenswrapper[15202]: I0319 09:27:31.247910 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"}
Mar 19 09:27:31.296674 master-0 kubenswrapper[15202]: I0319 09:27:31.296604 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cdc9755cd-fl679"]
Mar 19 09:27:31.301372 master-0 kubenswrapper[15202]: I0319 09:27:31.297553 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.304002 master-0 kubenswrapper[15202]: I0319 09:27:31.303959 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 09:27:31.304289 master-0 kubenswrapper[15202]: I0319 09:27:31.304268 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 09:27:31.304453 master-0 kubenswrapper[15202]: I0319 09:27:31.304435 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 09:27:31.304593 master-0 kubenswrapper[15202]: I0319 09:27:31.304571 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 09:27:31.304693 master-0 kubenswrapper[15202]: I0319 09:27:31.304664 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 09:27:31.328629 master-0 kubenswrapper[15202]: I0319 09:27:31.328579 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdc9755cd-fl679"]
Mar 19 09:27:31.436950 master-0 kubenswrapper[15202]: I0319 09:27:31.436905 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-serving-cert\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.437085 master-0 kubenswrapper[15202]: I0319 09:27:31.436965 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-oauth-serving-cert\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.437085 master-0 kubenswrapper[15202]: I0319 09:27:31.437059 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtnfz\" (UniqueName: \"kubernetes.io/projected/46339f4c-f550-4303-b237-4014572b69c1-kube-api-access-dtnfz\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.437165 master-0 kubenswrapper[15202]: I0319 09:27:31.437110 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-oauth-config\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.437165 master-0 kubenswrapper[15202]: I0319 09:27:31.437130 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-console-config\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.437223 master-0 kubenswrapper[15202]: I0319 09:27:31.437198 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-service-ca\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.538272 master-0 kubenswrapper[15202]: I0319 09:27:31.538212 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-oauth-serving-cert\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.538529 master-0 kubenswrapper[15202]: I0319 09:27:31.538300 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtnfz\" (UniqueName: \"kubernetes.io/projected/46339f4c-f550-4303-b237-4014572b69c1-kube-api-access-dtnfz\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.538529 master-0 kubenswrapper[15202]: I0319 09:27:31.538353 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-oauth-config\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.538529 master-0 kubenswrapper[15202]: I0319 09:27:31.538391 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-console-config\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.538529 master-0 kubenswrapper[15202]: I0319 09:27:31.538431 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-service-ca\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.538529 master-0 kubenswrapper[15202]: I0319 09:27:31.538478 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-serving-cert\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.539962 master-0 kubenswrapper[15202]: I0319 09:27:31.539889 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-oauth-serving-cert\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.541415 master-0 kubenswrapper[15202]: I0319 09:27:31.540289 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-console-config\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.541415 master-0 kubenswrapper[15202]: I0319 09:27:31.540739 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-service-ca\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.542852 master-0 kubenswrapper[15202]: I0319 09:27:31.542304 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-serving-cert\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.554269 master-0 kubenswrapper[15202]: I0319 09:27:31.554164 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-oauth-config\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.556564 master-0 kubenswrapper[15202]: I0319 09:27:31.556528 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtnfz\" (UniqueName: \"kubernetes.io/projected/46339f4c-f550-4303-b237-4014572b69c1-kube-api-access-dtnfz\") pod \"console-cdc9755cd-fl679\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:31.655240 master-0 kubenswrapper[15202]: I0319 09:27:31.655087 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:32.128142 master-0 kubenswrapper[15202]: I0319 09:27:32.128098 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdc9755cd-fl679"]
Mar 19 09:27:32.134070 master-0 kubenswrapper[15202]: W0319 09:27:32.134038 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46339f4c_f550_4303_b237_4014572b69c1.slice/crio-ddd0d08da1a19bd9635f3f69eb963eb666004399b604865aa340a8d2bd1f9bfb WatchSource:0}: Error finding container ddd0d08da1a19bd9635f3f69eb963eb666004399b604865aa340a8d2bd1f9bfb: Status 404 returned error can't find the container with id ddd0d08da1a19bd9635f3f69eb963eb666004399b604865aa340a8d2bd1f9bfb
Mar 19 09:27:32.265093 master-0 kubenswrapper[15202]: I0319 09:27:32.265032 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a"}
Mar 19 09:27:32.265093 master-0 kubenswrapper[15202]: I0319 09:27:32.265094 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerStarted","Data":"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2"}
Mar 19 09:27:32.266704 master-0 kubenswrapper[15202]: I0319 09:27:32.266430 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdc9755cd-fl679" event={"ID":"46339f4c-f550-4303-b237-4014572b69c1","Type":"ContainerStarted","Data":"ddd0d08da1a19bd9635f3f69eb963eb666004399b604865aa340a8d2bd1f9bfb"}
Mar 19 09:27:32.308330 master-0 kubenswrapper[15202]: I0319 09:27:32.308195 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.109872269 podStartE2EDuration="11.308175231s" podCreationTimestamp="2026-03-19 09:27:21 +0000 UTC" firstStartedPulling="2026-03-19 09:27:24.104632486 +0000 UTC m=+161.490047292" lastFinishedPulling="2026-03-19 09:27:30.302935438 +0000 UTC m=+167.688350254" observedRunningTime="2026-03-19 09:27:32.30266933 +0000 UTC m=+169.688084146" watchObservedRunningTime="2026-03-19 09:27:32.308175231 +0000 UTC m=+169.693590047"
Mar 19 09:27:32.964408 master-0 kubenswrapper[15202]: I0319 09:27:32.964343 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 09:27:32.965687 master-0 kubenswrapper[15202]: I0319 09:27:32.965664 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:32.968017 master-0 kubenswrapper[15202]: I0319 09:27:32.967971 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 09:27:32.968204 master-0 kubenswrapper[15202]: I0319 09:27:32.968186 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-w5d24"
Mar 19 09:27:32.980646 master-0 kubenswrapper[15202]: I0319 09:27:32.980593 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 09:27:33.070395 master-0 kubenswrapper[15202]: I0319 09:27:33.070322 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.070657 master-0 kubenswrapper[15202]: I0319 09:27:33.070427 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-var-lock\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.070657 master-0 kubenswrapper[15202]: I0319 09:27:33.070446 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.172326 master-0 kubenswrapper[15202]: I0319 09:27:33.172264 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.172567 master-0 kubenswrapper[15202]: I0319 09:27:33.172402 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.172567 master-0 kubenswrapper[15202]: I0319 09:27:33.172436 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-var-lock\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.172673 master-0 kubenswrapper[15202]: I0319 09:27:33.172586 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-var-lock\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.172724 master-0 kubenswrapper[15202]: I0319 09:27:33.172686 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.192424 master-0 kubenswrapper[15202]: I0319 09:27:33.192377 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.292231 master-0 kubenswrapper[15202]: I0319 09:27:33.292083 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 09:27:33.808220 master-0 kubenswrapper[15202]: I0319 09:27:33.808148 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 09:27:34.288933 master-0 kubenswrapper[15202]: I0319 09:27:34.288862 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a407e4e7-02ce-4b86-8314-1f3cccc10ccf","Type":"ContainerStarted","Data":"5c0aaa8a6778afccc22d6e79f1e9adc3cfe1b83a454c94fb1a653599ba2bc0a6"}
Mar 19 09:27:34.288933 master-0 kubenswrapper[15202]: I0319 09:27:34.288920 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a407e4e7-02ce-4b86-8314-1f3cccc10ccf","Type":"ContainerStarted","Data":"41e708b7d40d47362e699214b9959b220d7ce3d621224cf5042efa6f4650c5f5"}
Mar 19 09:27:34.311139 master-0 kubenswrapper[15202]: I0319 09:27:34.310658 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.310635922 podStartE2EDuration="2.310635922s" podCreationTimestamp="2026-03-19 09:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:34.308758865 +0000 UTC m=+171.694173681" watchObservedRunningTime="2026-03-19 09:27:34.310635922 +0000 UTC m=+171.696050738"
Mar 19 09:27:35.457061 master-0 kubenswrapper[15202]: I0319 09:27:35.457003 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9"
Mar 19 09:27:35.457061 master-0 kubenswrapper[15202]: I0319 09:27:35.457060 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9"
Mar 19 09:27:36.859179 master-0 kubenswrapper[15202]: I0319 09:27:36.859015 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:27:37.321033 master-0 kubenswrapper[15202]: I0319 09:27:37.320907 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdc9755cd-fl679" event={"ID":"46339f4c-f550-4303-b237-4014572b69c1","Type":"ContainerStarted","Data":"a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911"}
Mar 19 09:27:37.349810 master-0 kubenswrapper[15202]: I0319 09:27:37.349682 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cdc9755cd-fl679" podStartSLOduration=1.949845007 podStartE2EDuration="6.349655047s" podCreationTimestamp="2026-03-19 09:27:31 +0000 UTC" firstStartedPulling="2026-03-19 09:27:32.136596481 +0000 UTC m=+169.522011297" lastFinishedPulling="2026-03-19 09:27:36.536406521 +0000 UTC m=+173.921821337" observedRunningTime="2026-03-19 09:27:37.34394538 +0000 UTC m=+174.729360216" watchObservedRunningTime="2026-03-19 09:27:37.349655047 +0000 UTC m=+174.735069863"
Mar 19 09:27:40.270744 master-0 kubenswrapper[15202]: I0319 09:27:40.270640 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-697d79fb97-jrvk4"]
Mar 19 09:27:40.271889 master-0 kubenswrapper[15202]: I0319 09:27:40.271853 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.281485 master-0 kubenswrapper[15202]: I0319 09:27:40.281411 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 09:27:40.282707 master-0 kubenswrapper[15202]: I0319 09:27:40.282668 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-697d79fb97-jrvk4"]
Mar 19 09:27:40.418065 master-0 kubenswrapper[15202]: I0319 09:27:40.418013 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-serving-cert\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.418304 master-0 kubenswrapper[15202]: I0319 09:27:40.418073 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-oauth-serving-cert\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.418304 master-0 kubenswrapper[15202]: I0319 09:27:40.418103 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-service-ca\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.418454 master-0 kubenswrapper[15202]: I0319 09:27:40.418349 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-oauth-config\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.418538 master-0 kubenswrapper[15202]: I0319 09:27:40.418499 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-trusted-ca-bundle\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.418586 master-0 kubenswrapper[15202]: I0319 09:27:40.418541 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94j6\" (UniqueName: \"kubernetes.io/projected/8157e508-83eb-416e-9c10-f193cd4dbd53-kube-api-access-h94j6\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.418877 master-0 kubenswrapper[15202]: I0319 09:27:40.418809 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-console-config\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524618 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-oauth-config\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524698 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-trusted-ca-bundle\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524729 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h94j6\" (UniqueName: \"kubernetes.io/projected/8157e508-83eb-416e-9c10-f193cd4dbd53-kube-api-access-h94j6\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524794 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-console-config\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524857 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-serving-cert\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524888 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-oauth-serving-cert\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.524915 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-service-ca\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.526949 master-0 kubenswrapper[15202]: I0319 09:27:40.526302 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-service-ca\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.533244 master-0 kubenswrapper[15202]: I0319 09:27:40.533202 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-oauth-config\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.536618 master-0 kubenswrapper[15202]: I0319 09:27:40.533545 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-trusted-ca-bundle\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.536618 master-0 kubenswrapper[15202]: I0319 09:27:40.534258 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-console-config\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.536618 master-0 kubenswrapper[15202]: I0319 09:27:40.535003 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-oauth-serving-cert\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.541696 master-0 kubenswrapper[15202]: I0319 09:27:40.541654 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-serving-cert\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.561759 master-0 kubenswrapper[15202]: I0319 09:27:40.561708 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94j6\" (UniqueName: \"kubernetes.io/projected/8157e508-83eb-416e-9c10-f193cd4dbd53-kube-api-access-h94j6\") pod \"console-697d79fb97-jrvk4\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:40.601420 master-0 kubenswrapper[15202]: I0319 09:27:40.601343 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:41.196349 master-0 kubenswrapper[15202]: I0319 09:27:41.195921 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-697d79fb97-jrvk4"]
Mar 19 09:27:41.201582 master-0 kubenswrapper[15202]: W0319 09:27:41.201477 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8157e508_83eb_416e_9c10_f193cd4dbd53.slice/crio-d64ac5047f232dcbfc61a55bd66d19afa32b52101b07d274dfb77365432ce423 WatchSource:0}: Error finding container d64ac5047f232dcbfc61a55bd66d19afa32b52101b07d274dfb77365432ce423: Status 404 returned error can't find the container with id d64ac5047f232dcbfc61a55bd66d19afa32b52101b07d274dfb77365432ce423
Mar 19 09:27:41.350597 master-0 kubenswrapper[15202]: I0319 09:27:41.350521 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697d79fb97-jrvk4" event={"ID":"8157e508-83eb-416e-9c10-f193cd4dbd53","Type":"ContainerStarted","Data":"4088813fc22eecbc208070992edf1790b236e8a23cc85474648baf7d6dc7ecb8"}
Mar 19 09:27:41.350597 master-0 kubenswrapper[15202]: I0319 09:27:41.350576 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697d79fb97-jrvk4" event={"ID":"8157e508-83eb-416e-9c10-f193cd4dbd53","Type":"ContainerStarted","Data":"d64ac5047f232dcbfc61a55bd66d19afa32b52101b07d274dfb77365432ce423"}
Mar 19 09:27:41.374305 master-0 kubenswrapper[15202]: I0319 09:27:41.374225 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-697d79fb97-jrvk4" podStartSLOduration=1.374200483 podStartE2EDuration="1.374200483s" podCreationTimestamp="2026-03-19 09:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:41.37248767 +0000 UTC m=+178.757902496" watchObservedRunningTime="2026-03-19 09:27:41.374200483 +0000 UTC m=+178.759615299"
Mar 19 09:27:41.656036 master-0 kubenswrapper[15202]: I0319 09:27:41.655850 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:41.656036 master-0 kubenswrapper[15202]: I0319 09:27:41.655939 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:27:41.659500 master-0 kubenswrapper[15202]: I0319 09:27:41.658886 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body=
Mar 19 09:27:41.659500 master-0 kubenswrapper[15202]: I0319 09:27:41.658940 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused"
Mar 19 09:27:41.997391 master-0 kubenswrapper[15202]: I0319 09:27:41.997325 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-848cd9b885-hcbh9"]
Mar 19 09:27:46.574555 master-0 kubenswrapper[15202]: I0319 09:27:46.571037 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 19 09:27:46.574555 master-0 kubenswrapper[15202]: I0319 09:27:46.571309 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-3-master-0" podUID="a407e4e7-02ce-4b86-8314-1f3cccc10ccf" containerName="installer" containerID="cri-o://5c0aaa8a6778afccc22d6e79f1e9adc3cfe1b83a454c94fb1a653599ba2bc0a6" gracePeriod=30
Mar 19 09:27:49.765702 master-0 kubenswrapper[15202]: I0319 09:27:49.765617 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:27:49.767138 master-0 kubenswrapper[15202]: I0319 09:27:49.766862 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:49.775797 master-0 kubenswrapper[15202]: I0319 09:27:49.775740 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 19 09:27:49.897695 master-0 kubenswrapper[15202]: I0319 09:27:49.897583 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86b617be-3cae-4fea-bba7-199de3e4ecf6-kube-api-access\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:49.897695 master-0 kubenswrapper[15202]: I0319 09:27:49.897662 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:49.897976 master-0 kubenswrapper[15202]: I0319 09:27:49.897722 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-var-lock\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:49.999717 master-0 kubenswrapper[15202]: I0319 09:27:49.999600 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86b617be-3cae-4fea-bba7-199de3e4ecf6-kube-api-access\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:49.999717 master-0 kubenswrapper[15202]: I0319 09:27:49.999692 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:49.999717 master-0 kubenswrapper[15202]: I0319 09:27:49.999721 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-var-lock\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:50.000161 master-0 kubenswrapper[15202]: I0319 09:27:49.999852 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-var-lock\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:50.000300 master-0 kubenswrapper[15202]: I0319 09:27:50.000244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:50.016797 master-0 kubenswrapper[15202]: I0319 09:27:50.016698 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86b617be-3cae-4fea-bba7-199de3e4ecf6-kube-api-access\") pod \"installer-4-master-0\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:50.145346 master-0 kubenswrapper[15202]: I0319 09:27:50.145231 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 19 09:27:50.602420 master-0 kubenswrapper[15202]: I0319 09:27:50.602349 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:50.602420 master-0 kubenswrapper[15202]: I0319 09:27:50.602407 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-697d79fb97-jrvk4"
Mar 19 09:27:50.604756 master-0 kubenswrapper[15202]: I0319 09:27:50.604709 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body=
Mar 19 09:27:50.605021 master-0 kubenswrapper[15202]: I0319 09:27:50.604757 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused"
Mar 19 09:27:51.655883 master-0 kubenswrapper[15202]: I0319 09:27:51.655780 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body=
Mar 19 09:27:51.656517 master-0 kubenswrapper[15202]: I0319 09:27:51.655888 15202 prober.go:107] "Probe failed"
probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:27:55.464649 master-0 kubenswrapper[15202]: I0319 09:27:55.464574 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:27:55.468726 master-0 kubenswrapper[15202]: I0319 09:27:55.468663 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8c858dd9d-j8mx9" Mar 19 09:28:00.602702 master-0 kubenswrapper[15202]: I0319 09:28:00.602658 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:28:00.603273 master-0 kubenswrapper[15202]: I0319 09:28:00.603241 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:28:01.656714 master-0 kubenswrapper[15202]: I0319 09:28:01.656639 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:28:01.656714 master-0 kubenswrapper[15202]: I0319 09:28:01.656712 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:28:05.954266 master-0 kubenswrapper[15202]: E0319 09:28:05.954204 15202 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-poda407e4e7_02ce_4b86_8314_1f3cccc10ccf.slice/crio-conmon-5c0aaa8a6778afccc22d6e79f1e9adc3cfe1b83a454c94fb1a653599ba2bc0a6.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:28:06.545907 master-0 kubenswrapper[15202]: I0319 09:28:06.545842 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_a407e4e7-02ce-4b86-8314-1f3cccc10ccf/installer/0.log" Mar 19 09:28:06.545907 master-0 kubenswrapper[15202]: I0319 09:28:06.545907 15202 generic.go:334] "Generic (PLEG): container finished" podID="a407e4e7-02ce-4b86-8314-1f3cccc10ccf" containerID="5c0aaa8a6778afccc22d6e79f1e9adc3cfe1b83a454c94fb1a653599ba2bc0a6" exitCode=1 Mar 19 09:28:06.546338 master-0 kubenswrapper[15202]: I0319 09:28:06.545962 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a407e4e7-02ce-4b86-8314-1f3cccc10ccf","Type":"ContainerDied","Data":"5c0aaa8a6778afccc22d6e79f1e9adc3cfe1b83a454c94fb1a653599ba2bc0a6"} Mar 19 09:28:06.707890 master-0 kubenswrapper[15202]: I0319 09:28:06.705829 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 19 09:28:07.017430 master-0 kubenswrapper[15202]: I0319 09:28:07.017364 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_a407e4e7-02ce-4b86-8314-1f3cccc10ccf/installer/0.log" Mar 19 09:28:07.017980 master-0 kubenswrapper[15202]: I0319 09:28:07.017528 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:28:07.021911 master-0 kubenswrapper[15202]: I0319 09:28:07.021833 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" podUID="2ca4075f-8b54-49da-a2b8-6a22801f6607" containerName="oauth-openshift" containerID="cri-o://13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2" gracePeriod=15 Mar 19 09:28:07.144310 master-0 kubenswrapper[15202]: I0319 09:28:07.144240 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kube-api-access\") pod \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " Mar 19 09:28:07.144708 master-0 kubenswrapper[15202]: I0319 09:28:07.144378 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kubelet-dir\") pod \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " Mar 19 09:28:07.144708 master-0 kubenswrapper[15202]: I0319 09:28:07.144451 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-var-lock\") pod \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\" (UID: \"a407e4e7-02ce-4b86-8314-1f3cccc10ccf\") " Mar 19 09:28:07.149843 master-0 kubenswrapper[15202]: I0319 09:28:07.144901 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-var-lock" (OuterVolumeSpecName: "var-lock") pod "a407e4e7-02ce-4b86-8314-1f3cccc10ccf" (UID: "a407e4e7-02ce-4b86-8314-1f3cccc10ccf"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:07.149843 master-0 kubenswrapper[15202]: I0319 09:28:07.147885 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a407e4e7-02ce-4b86-8314-1f3cccc10ccf" (UID: "a407e4e7-02ce-4b86-8314-1f3cccc10ccf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:07.156414 master-0 kubenswrapper[15202]: I0319 09:28:07.156324 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a407e4e7-02ce-4b86-8314-1f3cccc10ccf" (UID: "a407e4e7-02ce-4b86-8314-1f3cccc10ccf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:07.247268 master-0 kubenswrapper[15202]: I0319 09:28:07.247201 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.247268 master-0 kubenswrapper[15202]: I0319 09:28:07.247254 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.247268 master-0 kubenswrapper[15202]: I0319 09:28:07.247276 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a407e4e7-02ce-4b86-8314-1f3cccc10ccf-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.541547 master-0 kubenswrapper[15202]: I0319 09:28:07.541485 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:28:07.554855 master-0 kubenswrapper[15202]: I0319 09:28:07.554763 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-7n68q" event={"ID":"1dc7476c-75a8-40fe-93f7-fca31aa2ebda","Type":"ContainerStarted","Data":"bf3a84f7f1f4aca6d9a11344ddc675127817d94e29c0e0153abb263ced9d8414"} Mar 19 09:28:07.556457 master-0 kubenswrapper[15202]: I0319 09:28:07.556355 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:28:07.557637 master-0 kubenswrapper[15202]: I0319 09:28:07.557582 15202 patch_prober.go:28] interesting pod/downloads-66b8ffb895-7n68q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Mar 19 09:28:07.557722 master-0 kubenswrapper[15202]: I0319 09:28:07.557666 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-7n68q" podUID="1dc7476c-75a8-40fe-93f7-fca31aa2ebda" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Mar 19 09:28:07.559677 master-0 kubenswrapper[15202]: I0319 09:28:07.559356 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"86b617be-3cae-4fea-bba7-199de3e4ecf6","Type":"ContainerStarted","Data":"355cd89078453b5d440100fc82d5481b6a468e22fd22088f1a704fff3b57820e"} Mar 19 09:28:07.559763 master-0 kubenswrapper[15202]: I0319 09:28:07.559684 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" 
event={"ID":"86b617be-3cae-4fea-bba7-199de3e4ecf6","Type":"ContainerStarted","Data":"bf817d255fbe35689e89ade319171e62303ff41aeeae5c5e4b687a11f31215bd"} Mar 19 09:28:07.561015 master-0 kubenswrapper[15202]: I0319 09:28:07.560972 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_a407e4e7-02ce-4b86-8314-1f3cccc10ccf/installer/0.log" Mar 19 09:28:07.561094 master-0 kubenswrapper[15202]: I0319 09:28:07.561060 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a407e4e7-02ce-4b86-8314-1f3cccc10ccf","Type":"ContainerDied","Data":"41e708b7d40d47362e699214b9959b220d7ce3d621224cf5042efa6f4650c5f5"} Mar 19 09:28:07.561139 master-0 kubenswrapper[15202]: I0319 09:28:07.561107 15202 scope.go:117] "RemoveContainer" containerID="5c0aaa8a6778afccc22d6e79f1e9adc3cfe1b83a454c94fb1a653599ba2bc0a6" Mar 19 09:28:07.561193 master-0 kubenswrapper[15202]: I0319 09:28:07.561160 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 09:28:07.563647 master-0 kubenswrapper[15202]: I0319 09:28:07.563588 15202 generic.go:334] "Generic (PLEG): container finished" podID="2ca4075f-8b54-49da-a2b8-6a22801f6607" containerID="13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2" exitCode=0 Mar 19 09:28:07.563726 master-0 kubenswrapper[15202]: I0319 09:28:07.563647 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" event={"ID":"2ca4075f-8b54-49da-a2b8-6a22801f6607","Type":"ContainerDied","Data":"13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2"} Mar 19 09:28:07.563726 master-0 kubenswrapper[15202]: I0319 09:28:07.563686 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" event={"ID":"2ca4075f-8b54-49da-a2b8-6a22801f6607","Type":"ContainerDied","Data":"4a15f628112cd55d1202ebeef0b370f3a8aa4af37f6ffa1ca0c4349955430188"} Mar 19 09:28:07.563726 master-0 kubenswrapper[15202]: I0319 09:28:07.563656 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-848cd9b885-hcbh9" Mar 19 09:28:07.614105 master-0 kubenswrapper[15202]: I0319 09:28:07.612830 15202 scope.go:117] "RemoveContainer" containerID="13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2" Mar 19 09:28:07.634270 master-0 kubenswrapper[15202]: I0319 09:28:07.634166 15202 scope.go:117] "RemoveContainer" containerID="13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2" Mar 19 09:28:07.635192 master-0 kubenswrapper[15202]: E0319 09:28:07.635136 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2\": container with ID starting with 13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2 not found: ID does not exist" containerID="13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2" Mar 19 09:28:07.635364 master-0 kubenswrapper[15202]: I0319 09:28:07.635320 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2"} err="failed to get container status \"13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2\": rpc error: code = NotFound desc = could not find container \"13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2\": container with ID starting with 13e247019717d5fac09002439bef49a8dd1e6ed6738e5c208b9017ffa8397ec2 not found: ID does not exist" Mar 19 09:28:07.667536 master-0 kubenswrapper[15202]: I0319 09:28:07.667404 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-serving-cert\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.667977 master-0 
kubenswrapper[15202]: I0319 09:28:07.667955 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-router-certs\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668132 master-0 kubenswrapper[15202]: I0319 09:28:07.668114 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-cliconfig\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668229 master-0 kubenswrapper[15202]: I0319 09:28:07.668212 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-policies\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668375 master-0 kubenswrapper[15202]: I0319 09:28:07.668357 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qptcn\" (UniqueName: \"kubernetes.io/projected/2ca4075f-8b54-49da-a2b8-6a22801f6607-kube-api-access-qptcn\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668541 master-0 kubenswrapper[15202]: I0319 09:28:07.668527 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-login\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668639 master-0 kubenswrapper[15202]: I0319 09:28:07.668625 15202 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-service-ca\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668751 master-0 kubenswrapper[15202]: I0319 09:28:07.668736 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-session\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668854 master-0 kubenswrapper[15202]: I0319 09:28:07.668841 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-trusted-ca-bundle\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.668943 master-0 kubenswrapper[15202]: I0319 09:28:07.668927 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-ocp-branding-template\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.669038 master-0 kubenswrapper[15202]: I0319 09:28:07.669023 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-error\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.669132 master-0 kubenswrapper[15202]: I0319 
09:28:07.669117 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-provider-selection\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.669216 master-0 kubenswrapper[15202]: I0319 09:28:07.669203 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-dir\") pod \"2ca4075f-8b54-49da-a2b8-6a22801f6607\" (UID: \"2ca4075f-8b54-49da-a2b8-6a22801f6607\") " Mar 19 09:28:07.669303 master-0 kubenswrapper[15202]: I0319 09:28:07.668672 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:07.669349 master-0 kubenswrapper[15202]: I0319 09:28:07.669050 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:07.669349 master-0 kubenswrapper[15202]: I0319 09:28:07.669315 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:07.669896 master-0 kubenswrapper[15202]: I0319 09:28:07.669878 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.669974 master-0 kubenswrapper[15202]: I0319 09:28:07.669961 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.670040 master-0 kubenswrapper[15202]: I0319 09:28:07.670029 15202 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.671402 master-0 kubenswrapper[15202]: I0319 09:28:07.671053 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:07.671402 master-0 kubenswrapper[15202]: I0319 09:28:07.671157 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:07.677924 master-0 kubenswrapper[15202]: I0319 09:28:07.677847 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca4075f-8b54-49da-a2b8-6a22801f6607-kube-api-access-qptcn" (OuterVolumeSpecName: "kube-api-access-qptcn") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "kube-api-access-qptcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:07.677924 master-0 kubenswrapper[15202]: I0319 09:28:07.677841 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.678401 master-0 kubenswrapper[15202]: I0319 09:28:07.678342 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.679587 master-0 kubenswrapper[15202]: I0319 09:28:07.678767 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.679670 master-0 kubenswrapper[15202]: I0319 09:28:07.679221 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.679741 master-0 kubenswrapper[15202]: I0319 09:28:07.679619 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.680572 master-0 kubenswrapper[15202]: I0319 09:28:07.680431 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.681734 master-0 kubenswrapper[15202]: I0319 09:28:07.681640 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2ca4075f-8b54-49da-a2b8-6a22801f6607" (UID: "2ca4075f-8b54-49da-a2b8-6a22801f6607"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:07.729714 master-0 kubenswrapper[15202]: I0319 09:28:07.728671 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=18.728646788 podStartE2EDuration="18.728646788s" podCreationTimestamp="2026-03-19 09:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:07.725112528 +0000 UTC m=+205.110527344" watchObservedRunningTime="2026-03-19 09:28:07.728646788 +0000 UTC m=+205.114061604" Mar 19 09:28:07.774001 master-0 kubenswrapper[15202]: I0319 09:28:07.773920 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 09:28:07.774727 master-0 kubenswrapper[15202]: I0319 09:28:07.774673 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qptcn\" (UniqueName: \"kubernetes.io/projected/2ca4075f-8b54-49da-a2b8-6a22801f6607-kube-api-access-qptcn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.774824 master-0 kubenswrapper[15202]: I0319 09:28:07.774806 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.774917 master-0 kubenswrapper[15202]: I0319 09:28:07.774904 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775006 master-0 kubenswrapper[15202]: I0319 09:28:07.774990 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775094 master-0 kubenswrapper[15202]: I0319 09:28:07.775077 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775178 master-0 kubenswrapper[15202]: I0319 09:28:07.775163 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775266 master-0 kubenswrapper[15202]: I0319 09:28:07.775252 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775355 master-0 kubenswrapper[15202]: I0319 09:28:07.775339 15202 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2ca4075f-8b54-49da-a2b8-6a22801f6607-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775434 master-0 kubenswrapper[15202]: I0319 09:28:07.775421 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.775539 master-0 kubenswrapper[15202]: I0319 09:28:07.775524 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ca4075f-8b54-49da-a2b8-6a22801f6607-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:07.789319 master-0 kubenswrapper[15202]: I0319 09:28:07.789115 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 09:28:07.815331 master-0 kubenswrapper[15202]: I0319 09:28:07.815216 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-66b8ffb895-7n68q" podStartSLOduration=2.682599972 podStartE2EDuration="45.815196053s" podCreationTimestamp="2026-03-19 09:27:22 +0000 UTC" firstStartedPulling="2026-03-19 09:27:23.893817439 +0000 UTC m=+161.279232255" lastFinishedPulling="2026-03-19 09:28:07.02641352 +0000 UTC m=+204.411828336" observedRunningTime="2026-03-19 09:28:07.812280559 +0000 UTC m=+205.197695385" watchObservedRunningTime="2026-03-19 09:28:07.815196053 +0000 UTC m=+205.200610869" Mar 19 09:28:07.908919 master-0 kubenswrapper[15202]: I0319 09:28:07.908862 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-848cd9b885-hcbh9"] Mar 19 09:28:07.921247 master-0 kubenswrapper[15202]: I0319 09:28:07.921164 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-848cd9b885-hcbh9"] Mar 19 09:28:08.582038 
master-0 kubenswrapper[15202]: I0319 09:28:08.581967 15202 patch_prober.go:28] interesting pod/downloads-66b8ffb895-7n68q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Mar 19 09:28:08.582038 master-0 kubenswrapper[15202]: I0319 09:28:08.582035 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-7n68q" podUID="1dc7476c-75a8-40fe-93f7-fca31aa2ebda" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Mar 19 09:28:08.825717 master-0 kubenswrapper[15202]: I0319 09:28:08.825611 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca4075f-8b54-49da-a2b8-6a22801f6607" path="/var/lib/kubelet/pods/2ca4075f-8b54-49da-a2b8-6a22801f6607/volumes" Mar 19 09:28:08.826417 master-0 kubenswrapper[15202]: I0319 09:28:08.826369 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a407e4e7-02ce-4b86-8314-1f3cccc10ccf" path="/var/lib/kubelet/pods/a407e4e7-02ce-4b86-8314-1f3cccc10ccf/volumes" Mar 19 09:28:09.445123 master-0 kubenswrapper[15202]: I0319 09:28:09.445039 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5455ddcb95-p88pn"] Mar 19 09:28:09.445586 master-0 kubenswrapper[15202]: E0319 09:28:09.445549 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a407e4e7-02ce-4b86-8314-1f3cccc10ccf" containerName="installer" Mar 19 09:28:09.445586 master-0 kubenswrapper[15202]: I0319 09:28:09.445580 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="a407e4e7-02ce-4b86-8314-1f3cccc10ccf" containerName="installer" Mar 19 09:28:09.445750 master-0 kubenswrapper[15202]: E0319 09:28:09.445711 15202 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2ca4075f-8b54-49da-a2b8-6a22801f6607" containerName="oauth-openshift" Mar 19 09:28:09.445750 master-0 kubenswrapper[15202]: I0319 09:28:09.445725 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca4075f-8b54-49da-a2b8-6a22801f6607" containerName="oauth-openshift" Mar 19 09:28:09.445948 master-0 kubenswrapper[15202]: I0319 09:28:09.445912 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="a407e4e7-02ce-4b86-8314-1f3cccc10ccf" containerName="installer" Mar 19 09:28:09.447153 master-0 kubenswrapper[15202]: I0319 09:28:09.447112 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca4075f-8b54-49da-a2b8-6a22801f6607" containerName="oauth-openshift" Mar 19 09:28:09.448652 master-0 kubenswrapper[15202]: I0319 09:28:09.448614 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.456919 master-0 kubenswrapper[15202]: I0319 09:28:09.454631 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 09:28:09.456919 master-0 kubenswrapper[15202]: I0319 09:28:09.455039 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-2qc5w" Mar 19 09:28:09.456919 master-0 kubenswrapper[15202]: I0319 09:28:09.455597 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.457765 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.458000 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 09:28:09.459381 master-0 
kubenswrapper[15202]: I0319 09:28:09.458123 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.458228 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.458336 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.458614 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.458734 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 09:28:09.459381 master-0 kubenswrapper[15202]: I0319 09:28:09.458846 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:28:09.460087 master-0 kubenswrapper[15202]: I0319 09:28:09.460069 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 09:28:09.473679 master-0 kubenswrapper[15202]: I0319 09:28:09.473494 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 09:28:09.484792 master-0 kubenswrapper[15202]: I0319 09:28:09.484623 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 09:28:09.485840 master-0 kubenswrapper[15202]: I0319 09:28:09.485798 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-5455ddcb95-p88pn"] Mar 19 09:28:09.510813 master-0 kubenswrapper[15202]: I0319 09:28:09.510724 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-session\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.510813 master-0 kubenswrapper[15202]: I0319 09:28:09.510787 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-router-certs\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.510813 master-0 kubenswrapper[15202]: I0319 09:28:09.510823 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-login\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.510875 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-dir\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.510908 
15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-service-ca\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.510927 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.510949 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-error\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.510980 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt4h9\" (UniqueName: \"kubernetes.io/projected/5c2d7253-f08b-4aa3-b728-6012da77f513-kube-api-access-xt4h9\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.511011 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.511036 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.511058 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-policies\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.511078 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.511290 master-0 kubenswrapper[15202]: I0319 09:28:09.511096 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.587935 master-0 kubenswrapper[15202]: I0319 09:28:09.587487 15202 patch_prober.go:28] interesting pod/downloads-66b8ffb895-7n68q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Mar 19 09:28:09.587935 master-0 kubenswrapper[15202]: I0319 09:28:09.587557 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-7n68q" podUID="1dc7476c-75a8-40fe-93f7-fca31aa2ebda" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Mar 19 09:28:09.613373 master-0 kubenswrapper[15202]: I0319 09:28:09.613284 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt4h9\" (UniqueName: \"kubernetes.io/projected/5c2d7253-f08b-4aa3-b728-6012da77f513-kube-api-access-xt4h9\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.613373 master-0 kubenswrapper[15202]: I0319 09:28:09.613379 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613432 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613459 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-policies\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613510 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613531 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613570 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-session\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613588 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-router-certs\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613617 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-login\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613661 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-dir\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613707 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-service-ca\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " 
pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613731 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614009 master-0 kubenswrapper[15202]: I0319 09:28:09.613782 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-error\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.614560 master-0 kubenswrapper[15202]: I0319 09:28:09.614379 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-dir\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.618612 master-0 kubenswrapper[15202]: I0319 09:28:09.616613 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-policies\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.618612 master-0 kubenswrapper[15202]: I0319 09:28:09.618104 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.618612 master-0 kubenswrapper[15202]: I0319 09:28:09.618243 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.623696 master-0 kubenswrapper[15202]: I0319 09:28:09.623020 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-service-ca\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.623696 master-0 kubenswrapper[15202]: I0319 09:28:09.623112 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-error\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.630428 master-0 kubenswrapper[15202]: I0319 09:28:09.629513 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-router-certs\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " 
pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.631327 master-0 kubenswrapper[15202]: I0319 09:28:09.630813 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.631327 master-0 kubenswrapper[15202]: I0319 09:28:09.631159 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.637927 master-0 kubenswrapper[15202]: I0319 09:28:09.633890 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-login\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.637927 master-0 kubenswrapper[15202]: I0319 09:28:09.634165 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-session\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.637927 master-0 kubenswrapper[15202]: I0319 09:28:09.634389 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.717574 master-0 kubenswrapper[15202]: I0319 09:28:09.717371 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt4h9\" (UniqueName: \"kubernetes.io/projected/5c2d7253-f08b-4aa3-b728-6012da77f513-kube-api-access-xt4h9\") pod \"oauth-openshift-5455ddcb95-p88pn\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:09.777546 master-0 kubenswrapper[15202]: I0319 09:28:09.777455 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:10.602972 master-0 kubenswrapper[15202]: I0319 09:28:10.602890 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:28:10.604127 master-0 kubenswrapper[15202]: I0319 09:28:10.604077 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:28:11.210229 master-0 kubenswrapper[15202]: I0319 09:28:11.210114 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5455ddcb95-p88pn"] Mar 19 09:28:11.605670 master-0 kubenswrapper[15202]: I0319 09:28:11.605458 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" event={"ID":"5c2d7253-f08b-4aa3-b728-6012da77f513","Type":"ContainerStarted","Data":"8a3d4063a2259d6d3f04875dc5120e12eb59eab6fac456381c57fc80ddc8110a"} Mar 19 09:28:11.657180 master-0 kubenswrapper[15202]: I0319 09:28:11.656524 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:28:11.657180 master-0 kubenswrapper[15202]: I0319 09:28:11.656627 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:28:13.242306 master-0 kubenswrapper[15202]: I0319 09:28:13.242180 15202 patch_prober.go:28] interesting pod/downloads-66b8ffb895-7n68q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Mar 19 09:28:13.243317 master-0 kubenswrapper[15202]: I0319 09:28:13.242356 15202 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-66b8ffb895-7n68q" podUID="1dc7476c-75a8-40fe-93f7-fca31aa2ebda" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Mar 19 09:28:13.243317 master-0 kubenswrapper[15202]: I0319 09:28:13.242196 15202 patch_prober.go:28] interesting pod/downloads-66b8ffb895-7n68q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 
10.128.0.91:8080: connect: connection refused" start-of-body= Mar 19 09:28:13.243317 master-0 kubenswrapper[15202]: I0319 09:28:13.242756 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-7n68q" podUID="1dc7476c-75a8-40fe-93f7-fca31aa2ebda" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Mar 19 09:28:13.629103 master-0 kubenswrapper[15202]: I0319 09:28:13.628842 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" event={"ID":"5c2d7253-f08b-4aa3-b728-6012da77f513","Type":"ContainerStarted","Data":"bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd"} Mar 19 09:28:13.630486 master-0 kubenswrapper[15202]: I0319 09:28:13.630430 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:14.630378 master-0 kubenswrapper[15202]: I0319 09:28:14.630268 15202 patch_prober.go:28] interesting pod/oauth-openshift-5455ddcb95-p88pn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.97:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:28:14.631138 master-0 kubenswrapper[15202]: I0319 09:28:14.630408 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.97:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:15.639463 master-0 kubenswrapper[15202]: I0319 09:28:15.639343 15202 patch_prober.go:28] interesting pod/oauth-openshift-5455ddcb95-p88pn container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:28:15.640365 master-0 kubenswrapper[15202]: I0319 09:28:15.639499 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:16.657873 master-0 kubenswrapper[15202]: I0319 09:28:16.657809 15202 patch_prober.go:28] interesting pod/oauth-openshift-5455ddcb95-p88pn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:28:16.657873 master-0 kubenswrapper[15202]: I0319 09:28:16.657878 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:20.606514 master-0 kubenswrapper[15202]: I0319 09:28:20.605977 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:28:20.606514 master-0 kubenswrapper[15202]: I0319 09:28:20.606052 15202 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:28:20.778249 master-0 kubenswrapper[15202]: I0319 09:28:20.778160 15202 patch_prober.go:28] interesting pod/oauth-openshift-5455ddcb95-p88pn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:28:20.778528 master-0 kubenswrapper[15202]: I0319 09:28:20.778261 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.97:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:28:21.655711 master-0 kubenswrapper[15202]: I0319 09:28:21.655672 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:28:21.656262 master-0 kubenswrapper[15202]: I0319 09:28:21.656234 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:28:21.859365 master-0 kubenswrapper[15202]: I0319 09:28:21.859291 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:28:21.888698 master-0 kubenswrapper[15202]: I0319 09:28:21.888599 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:28:22.763089 master-0 kubenswrapper[15202]: I0319 09:28:22.763012 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:28:23.265232 master-0 kubenswrapper[15202]: I0319 09:28:23.265159 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-66b8ffb895-7n68q" Mar 19 09:28:29.495271 master-0 kubenswrapper[15202]: I0319 09:28:29.481130 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" podStartSLOduration=47.48110872 podStartE2EDuration="47.48110872s" podCreationTimestamp="2026-03-19 09:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:29.478669748 +0000 UTC m=+226.864084564" watchObservedRunningTime="2026-03-19 09:28:29.48110872 +0000 UTC m=+226.866523536" Mar 19 09:28:29.782160 master-0 kubenswrapper[15202]: I0319 09:28:29.782019 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:28:30.604554 master-0 kubenswrapper[15202]: I0319 09:28:30.604036 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:28:30.604554 master-0 kubenswrapper[15202]: I0319 09:28:30.604198 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" 
podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:28:31.656545 master-0 kubenswrapper[15202]: I0319 09:28:31.656336 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:28:31.656545 master-0 kubenswrapper[15202]: I0319 09:28:31.656418 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:28:36.200667 master-0 kubenswrapper[15202]: I0319 09:28:36.200589 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:28:36.201493 master-0 kubenswrapper[15202]: I0319 09:28:36.201101 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="alertmanager" containerID="cri-o://12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1" gracePeriod=120 Mar 19 09:28:36.201493 master-0 kubenswrapper[15202]: I0319 09:28:36.201177 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-metric" containerID="cri-o://1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf" gracePeriod=120 Mar 19 09:28:36.201493 master-0 kubenswrapper[15202]: I0319 09:28:36.201312 15202 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy" containerID="cri-o://e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de" gracePeriod=120 Mar 19 09:28:36.201493 master-0 kubenswrapper[15202]: I0319 09:28:36.201364 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="prom-label-proxy" containerID="cri-o://2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3" gracePeriod=120 Mar 19 09:28:36.201493 master-0 kubenswrapper[15202]: I0319 09:28:36.201411 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-web" containerID="cri-o://976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0" gracePeriod=120 Mar 19 09:28:36.201493 master-0 kubenswrapper[15202]: I0319 09:28:36.201489 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="config-reloader" containerID="cri-o://e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551" gracePeriod=120 Mar 19 09:28:36.385393 master-0 kubenswrapper[15202]: E0319 09:28:36.385281 15202 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb018304_4128_47a8_a4a6_39245f915703.slice/crio-2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb018304_4128_47a8_a4a6_39245f915703.slice/crio-12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb018304_4128_47a8_a4a6_39245f915703.slice/crio-conmon-12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb018304_4128_47a8_a4a6_39245f915703.slice/crio-conmon-e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:28:36.862238 master-0 kubenswrapper[15202]: I0319 09:28:36.862137 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3" exitCode=0 Mar 19 09:28:36.862238 master-0 kubenswrapper[15202]: I0319 09:28:36.862205 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de" exitCode=0 Mar 19 09:28:36.862238 master-0 kubenswrapper[15202]: I0319 09:28:36.862218 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551" exitCode=0 Mar 19 09:28:36.862238 master-0 kubenswrapper[15202]: I0319 09:28:36.862233 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1" exitCode=0 Mar 19 09:28:36.862883 master-0 kubenswrapper[15202]: I0319 09:28:36.862278 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3"} Mar 19 09:28:36.862883 master-0 
kubenswrapper[15202]: I0319 09:28:36.862382 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de"} Mar 19 09:28:36.862883 master-0 kubenswrapper[15202]: I0319 09:28:36.862408 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551"} Mar 19 09:28:36.862883 master-0 kubenswrapper[15202]: I0319 09:28:36.862430 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1"} Mar 19 09:28:37.676603 master-0 kubenswrapper[15202]: I0319 09:28:37.676542 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:37.864022 master-0 kubenswrapper[15202]: I0319 09:28:37.863935 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-metrics-client-ca\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.864548 master-0 kubenswrapper[15202]: I0319 09:28:37.864527 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-web-config\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.864731 master-0 kubenswrapper[15202]: I0319 09:28:37.864693 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.864875 master-0 kubenswrapper[15202]: I0319 09:28:37.864856 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.865079 master-0 kubenswrapper[15202]: I0319 09:28:37.865060 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-main-db\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: 
\"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.865334 master-0 kubenswrapper[15202]: I0319 09:28:37.865297 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-kube-api-access-s6wk6\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.865527 master-0 kubenswrapper[15202]: I0319 09:28:37.865510 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-tls-assets\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.865641 master-0 kubenswrapper[15202]: I0319 09:28:37.865625 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-config-out\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.865766 master-0 kubenswrapper[15202]: I0319 09:28:37.865747 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-config-volume\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.865911 master-0 kubenswrapper[15202]: I0319 09:28:37.865893 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-trusted-ca-bundle\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.866039 master-0 kubenswrapper[15202]: I0319 09:28:37.866019 15202 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-web\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.866348 master-0 kubenswrapper[15202]: I0319 09:28:37.866331 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-main-tls\") pod \"bb018304-4128-47a8-a4a6-39245f915703\" (UID: \"bb018304-4128-47a8-a4a6-39245f915703\") " Mar 19 09:28:37.867706 master-0 kubenswrapper[15202]: I0319 09:28:37.865780 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:37.867794 master-0 kubenswrapper[15202]: I0319 09:28:37.866671 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:28:37.871992 master-0 kubenswrapper[15202]: I0319 09:28:37.871863 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:37.873330 master-0 kubenswrapper[15202]: I0319 09:28:37.873117 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-kube-api-access-s6wk6" (OuterVolumeSpecName: "kube-api-access-s6wk6") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "kube-api-access-s6wk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:37.873330 master-0 kubenswrapper[15202]: I0319 09:28:37.873231 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:37.875227 master-0 kubenswrapper[15202]: I0319 09:28:37.875169 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:37.876211 master-0 kubenswrapper[15202]: I0319 09:28:37.876116 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-config-out" (OuterVolumeSpecName: "config-out") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:28:37.876626 master-0 kubenswrapper[15202]: I0319 09:28:37.876557 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:37.877653 master-0 kubenswrapper[15202]: I0319 09:28:37.876782 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:37.877653 master-0 kubenswrapper[15202]: I0319 09:28:37.877214 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:37.878277 master-0 kubenswrapper[15202]: I0319 09:28:37.878238 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:37.884113 master-0 kubenswrapper[15202]: I0319 09:28:37.884065 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf" exitCode=0 Mar 19 09:28:37.884113 master-0 kubenswrapper[15202]: I0319 09:28:37.884109 15202 generic.go:334] "Generic (PLEG): container finished" podID="bb018304-4128-47a8-a4a6-39245f915703" containerID="976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0" exitCode=0 Mar 19 09:28:37.884270 master-0 kubenswrapper[15202]: I0319 09:28:37.884138 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf"} Mar 19 09:28:37.884270 master-0 kubenswrapper[15202]: I0319 09:28:37.884188 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0"} Mar 19 09:28:37.884270 master-0 kubenswrapper[15202]: I0319 09:28:37.884197 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bb018304-4128-47a8-a4a6-39245f915703","Type":"ContainerDied","Data":"161b2a221d5f01568762c88faef403a00e73faedbb819207dd179cf3b76fe882"} Mar 19 09:28:37.884270 master-0 kubenswrapper[15202]: I0319 09:28:37.884219 15202 scope.go:117] "RemoveContainer" containerID="2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3" Mar 19 09:28:37.884431 master-0 kubenswrapper[15202]: I0319 09:28:37.884413 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:37.929516 master-0 kubenswrapper[15202]: I0319 09:28:37.929388 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-web-config" (OuterVolumeSpecName: "web-config") pod "bb018304-4128-47a8-a4a6-39245f915703" (UID: "bb018304-4128-47a8-a4a6-39245f915703"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:37.929752 master-0 kubenswrapper[15202]: I0319 09:28:37.929614 15202 scope.go:117] "RemoveContainer" containerID="1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf" Mar 19 09:28:37.948870 master-0 kubenswrapper[15202]: I0319 09:28:37.948807 15202 scope.go:117] "RemoveContainer" containerID="e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de" Mar 19 09:28:37.969055 master-0 kubenswrapper[15202]: I0319 09:28:37.968990 15202 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969055 master-0 kubenswrapper[15202]: I0319 09:28:37.969039 15202 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-web\") on node 
\"master-0\" DevicePath \"\"" Mar 19 09:28:37.969055 master-0 kubenswrapper[15202]: I0319 09:28:37.969056 15202 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969055 master-0 kubenswrapper[15202]: I0319 09:28:37.969069 15202 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb018304-4128-47a8-a4a6-39245f915703-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969084 15202 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-web-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969097 15202 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969107 15202 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969120 15202 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969131 15202 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-kube-api-access-s6wk6\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969140 15202 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb018304-4128-47a8-a4a6-39245f915703-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969153 15202 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb018304-4128-47a8-a4a6-39245f915703-config-out\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.969411 master-0 kubenswrapper[15202]: I0319 09:28:37.969164 15202 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb018304-4128-47a8-a4a6-39245f915703-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:37.970134 master-0 kubenswrapper[15202]: I0319 09:28:37.970104 15202 scope.go:117] "RemoveContainer" containerID="976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0" Mar 19 09:28:37.985931 master-0 kubenswrapper[15202]: I0319 09:28:37.985888 15202 scope.go:117] "RemoveContainer" containerID="e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551" Mar 19 09:28:38.012581 master-0 kubenswrapper[15202]: I0319 09:28:38.012525 15202 scope.go:117] "RemoveContainer" containerID="12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1" Mar 19 09:28:38.036552 master-0 kubenswrapper[15202]: I0319 09:28:38.036454 15202 scope.go:117] "RemoveContainer" containerID="20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca" Mar 19 09:28:38.063368 master-0 kubenswrapper[15202]: I0319 09:28:38.063305 15202 scope.go:117] "RemoveContainer" containerID="2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3" Mar 19 
09:28:38.063920 master-0 kubenswrapper[15202]: E0319 09:28:38.063868 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3\": container with ID starting with 2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3 not found: ID does not exist" containerID="2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3" Mar 19 09:28:38.064061 master-0 kubenswrapper[15202]: I0319 09:28:38.064022 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3"} err="failed to get container status \"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3\": rpc error: code = NotFound desc = could not find container \"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3\": container with ID starting with 2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3 not found: ID does not exist" Mar 19 09:28:38.064161 master-0 kubenswrapper[15202]: I0319 09:28:38.064144 15202 scope.go:117] "RemoveContainer" containerID="1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf" Mar 19 09:28:38.064759 master-0 kubenswrapper[15202]: E0319 09:28:38.064717 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf\": container with ID starting with 1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf not found: ID does not exist" containerID="1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf" Mar 19 09:28:38.064824 master-0 kubenswrapper[15202]: I0319 09:28:38.064762 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf"} 
err="failed to get container status \"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf\": rpc error: code = NotFound desc = could not find container \"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf\": container with ID starting with 1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf not found: ID does not exist" Mar 19 09:28:38.064824 master-0 kubenswrapper[15202]: I0319 09:28:38.064795 15202 scope.go:117] "RemoveContainer" containerID="e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de" Mar 19 09:28:38.065381 master-0 kubenswrapper[15202]: E0319 09:28:38.065358 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de\": container with ID starting with e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de not found: ID does not exist" containerID="e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de" Mar 19 09:28:38.065487 master-0 kubenswrapper[15202]: I0319 09:28:38.065450 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de"} err="failed to get container status \"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de\": rpc error: code = NotFound desc = could not find container \"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de\": container with ID starting with e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de not found: ID does not exist" Mar 19 09:28:38.065560 master-0 kubenswrapper[15202]: I0319 09:28:38.065547 15202 scope.go:117] "RemoveContainer" containerID="976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0" Mar 19 09:28:38.066065 master-0 kubenswrapper[15202]: E0319 09:28:38.066005 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0\": container with ID starting with 976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0 not found: ID does not exist" containerID="976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0" Mar 19 09:28:38.066121 master-0 kubenswrapper[15202]: I0319 09:28:38.066076 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0"} err="failed to get container status \"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0\": rpc error: code = NotFound desc = could not find container \"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0\": container with ID starting with 976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0 not found: ID does not exist" Mar 19 09:28:38.066121 master-0 kubenswrapper[15202]: I0319 09:28:38.066115 15202 scope.go:117] "RemoveContainer" containerID="e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551" Mar 19 09:28:38.073933 master-0 kubenswrapper[15202]: E0319 09:28:38.073765 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551\": container with ID starting with e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551 not found: ID does not exist" containerID="e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551" Mar 19 09:28:38.073933 master-0 kubenswrapper[15202]: I0319 09:28:38.073824 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551"} err="failed to get container status \"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551\": rpc error: code = NotFound desc = could not find 
container \"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551\": container with ID starting with e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551 not found: ID does not exist" Mar 19 09:28:38.073933 master-0 kubenswrapper[15202]: I0319 09:28:38.073861 15202 scope.go:117] "RemoveContainer" containerID="12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1" Mar 19 09:28:38.076283 master-0 kubenswrapper[15202]: E0319 09:28:38.076236 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1\": container with ID starting with 12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1 not found: ID does not exist" containerID="12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1" Mar 19 09:28:38.076452 master-0 kubenswrapper[15202]: I0319 09:28:38.076286 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1"} err="failed to get container status \"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1\": rpc error: code = NotFound desc = could not find container \"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1\": container with ID starting with 12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1 not found: ID does not exist" Mar 19 09:28:38.076452 master-0 kubenswrapper[15202]: I0319 09:28:38.076314 15202 scope.go:117] "RemoveContainer" containerID="20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca" Mar 19 09:28:38.077493 master-0 kubenswrapper[15202]: E0319 09:28:38.077367 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca\": container with ID starting with 
20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca not found: ID does not exist" containerID="20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca" Mar 19 09:28:38.077493 master-0 kubenswrapper[15202]: I0319 09:28:38.077438 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca"} err="failed to get container status \"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca\": rpc error: code = NotFound desc = could not find container \"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca\": container with ID starting with 20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca not found: ID does not exist" Mar 19 09:28:38.077493 master-0 kubenswrapper[15202]: I0319 09:28:38.077493 15202 scope.go:117] "RemoveContainer" containerID="2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3" Mar 19 09:28:38.078586 master-0 kubenswrapper[15202]: I0319 09:28:38.078198 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3"} err="failed to get container status \"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3\": rpc error: code = NotFound desc = could not find container \"2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3\": container with ID starting with 2d9b9b03a6bf414978debe90df49fb88980f4e39d8d306d69fdf44878a689cf3 not found: ID does not exist" Mar 19 09:28:38.078586 master-0 kubenswrapper[15202]: I0319 09:28:38.078288 15202 scope.go:117] "RemoveContainer" containerID="1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf" Mar 19 09:28:38.078989 master-0 kubenswrapper[15202]: I0319 09:28:38.078788 15202 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf"} err="failed to get container status \"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf\": rpc error: code = NotFound desc = could not find container \"1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf\": container with ID starting with 1525cf0594e885a933051a16a4cd5c12e4f9025e6b3a11dcb2f6b61dd2c8a4cf not found: ID does not exist" Mar 19 09:28:38.078989 master-0 kubenswrapper[15202]: I0319 09:28:38.078821 15202 scope.go:117] "RemoveContainer" containerID="e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de" Mar 19 09:28:38.079506 master-0 kubenswrapper[15202]: I0319 09:28:38.079451 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de"} err="failed to get container status \"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de\": rpc error: code = NotFound desc = could not find container \"e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de\": container with ID starting with e850cdd65e8cdb873de75af774985eb23be617ea4fc774693a3ec5b1ad9ee4de not found: ID does not exist" Mar 19 09:28:38.079579 master-0 kubenswrapper[15202]: I0319 09:28:38.079503 15202 scope.go:117] "RemoveContainer" containerID="976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0" Mar 19 09:28:38.079915 master-0 kubenswrapper[15202]: I0319 09:28:38.079857 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0"} err="failed to get container status \"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0\": rpc error: code = NotFound desc = could not find container \"976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0\": container with ID starting with 
976f3ae7342c265b798745606e476d93bce7a69a12b9d987a0d81efe496f06b0 not found: ID does not exist" Mar 19 09:28:38.079915 master-0 kubenswrapper[15202]: I0319 09:28:38.079904 15202 scope.go:117] "RemoveContainer" containerID="e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551" Mar 19 09:28:38.080296 master-0 kubenswrapper[15202]: I0319 09:28:38.080262 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551"} err="failed to get container status \"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551\": rpc error: code = NotFound desc = could not find container \"e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551\": container with ID starting with e2e8171c68fa51eecc58d630d41dcc2e97a0a94ade52d9e97e4c868f35eb0551 not found: ID does not exist" Mar 19 09:28:38.080296 master-0 kubenswrapper[15202]: I0319 09:28:38.080287 15202 scope.go:117] "RemoveContainer" containerID="12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1" Mar 19 09:28:38.080654 master-0 kubenswrapper[15202]: I0319 09:28:38.080592 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1"} err="failed to get container status \"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1\": rpc error: code = NotFound desc = could not find container \"12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1\": container with ID starting with 12143769d74c556985759e05ec8f305ad52e674937e3b0593cea1070ca729db1 not found: ID does not exist" Mar 19 09:28:38.080654 master-0 kubenswrapper[15202]: I0319 09:28:38.080629 15202 scope.go:117] "RemoveContainer" containerID="20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca" Mar 19 09:28:38.081000 master-0 kubenswrapper[15202]: I0319 09:28:38.080965 15202 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca"} err="failed to get container status \"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca\": rpc error: code = NotFound desc = could not find container \"20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca\": container with ID starting with 20bf3488845f04c76e8023e85e83fead98b82b3f4cf615f9841e76b46bd0d3ca not found: ID does not exist" Mar 19 09:28:38.227809 master-0 kubenswrapper[15202]: I0319 09:28:38.227710 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:28:38.241765 master-0 kubenswrapper[15202]: I0319 09:28:38.241681 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:28:38.285806 master-0 kubenswrapper[15202]: I0319 09:28:38.285736 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:28:38.286737 master-0 kubenswrapper[15202]: E0319 09:28:38.286710 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="init-config-reloader" Mar 19 09:28:38.286846 master-0 kubenswrapper[15202]: I0319 09:28:38.286832 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="init-config-reloader" Mar 19 09:28:38.286981 master-0 kubenswrapper[15202]: E0319 09:28:38.286964 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy" Mar 19 09:28:38.287054 master-0 kubenswrapper[15202]: I0319 09:28:38.287042 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy" Mar 19 09:28:38.287177 master-0 kubenswrapper[15202]: E0319 09:28:38.287164 15202 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="alertmanager" Mar 19 09:28:38.287239 master-0 kubenswrapper[15202]: I0319 09:28:38.287229 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="alertmanager" Mar 19 09:28:38.287326 master-0 kubenswrapper[15202]: E0319 09:28:38.287314 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-web" Mar 19 09:28:38.287403 master-0 kubenswrapper[15202]: I0319 09:28:38.287391 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-web" Mar 19 09:28:38.287507 master-0 kubenswrapper[15202]: E0319 09:28:38.287493 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="prom-label-proxy" Mar 19 09:28:38.287595 master-0 kubenswrapper[15202]: I0319 09:28:38.287581 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="prom-label-proxy" Mar 19 09:28:38.287680 master-0 kubenswrapper[15202]: E0319 09:28:38.287666 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="config-reloader" Mar 19 09:28:38.287758 master-0 kubenswrapper[15202]: I0319 09:28:38.287745 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="config-reloader" Mar 19 09:28:38.290322 master-0 kubenswrapper[15202]: E0319 09:28:38.290289 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-metric" Mar 19 09:28:38.290518 master-0 kubenswrapper[15202]: I0319 09:28:38.290500 15202 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-metric" Mar 19 09:28:38.290990 master-0 kubenswrapper[15202]: I0319 09:28:38.290969 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-web" Mar 19 09:28:38.291145 master-0 kubenswrapper[15202]: I0319 09:28:38.291112 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="prom-label-proxy" Mar 19 09:28:38.291242 master-0 kubenswrapper[15202]: I0319 09:28:38.291228 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy" Mar 19 09:28:38.291337 master-0 kubenswrapper[15202]: I0319 09:28:38.291324 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="alertmanager" Mar 19 09:28:38.291430 master-0 kubenswrapper[15202]: I0319 09:28:38.291416 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="config-reloader" Mar 19 09:28:38.291591 master-0 kubenswrapper[15202]: I0319 09:28:38.291575 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb018304-4128-47a8-a4a6-39245f915703" containerName="kube-rbac-proxy-metric" Mar 19 09:28:38.294771 master-0 kubenswrapper[15202]: I0319 09:28:38.294723 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.300898 master-0 kubenswrapper[15202]: I0319 09:28:38.300821 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 09:28:38.301240 master-0 kubenswrapper[15202]: I0319 09:28:38.301022 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:28:38.301240 master-0 kubenswrapper[15202]: I0319 09:28:38.301100 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 09:28:38.301240 master-0 kubenswrapper[15202]: I0319 09:28:38.301140 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:28:38.301801 master-0 kubenswrapper[15202]: I0319 09:28:38.301302 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 09:28:38.302404 master-0 kubenswrapper[15202]: I0319 09:28:38.302362 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:28:38.303416 master-0 kubenswrapper[15202]: I0319 09:28:38.303374 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 09:28:38.311560 master-0 kubenswrapper[15202]: I0319 09:28:38.311501 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 09:28:38.363885 master-0 kubenswrapper[15202]: I0319 09:28:38.363804 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:28:38.486280 master-0 kubenswrapper[15202]: I0319 09:28:38.486075 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80f89d04-6a07-4e86-b211-273789da32f2-config-out\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486280 master-0 kubenswrapper[15202]: I0319 09:28:38.486177 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486280 master-0 kubenswrapper[15202]: I0319 09:28:38.486237 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486280 master-0 kubenswrapper[15202]: I0319 09:28:38.486259 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80f89d04-6a07-4e86-b211-273789da32f2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486280 master-0 kubenswrapper[15202]: I0319 09:28:38.486290 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80f89d04-6a07-4e86-b211-273789da32f2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486324 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-config-volume\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486345 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486520 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486612 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80f89d04-6a07-4e86-b211-273789da32f2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486678 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-web-config\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486706 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f89d04-6a07-4e86-b211-273789da32f2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.486910 master-0 kubenswrapper[15202]: I0319 09:28:38.486744 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg7mh\" (UniqueName: \"kubernetes.io/projected/80f89d04-6a07-4e86-b211-273789da32f2-kube-api-access-jg7mh\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.588557 master-0 kubenswrapper[15202]: I0319 09:28:38.588490 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.588990 master-0 kubenswrapper[15202]: I0319 09:28:38.588961 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.589148 master-0 kubenswrapper[15202]: 
I0319 09:28:38.589128 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80f89d04-6a07-4e86-b211-273789da32f2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.589321 master-0 kubenswrapper[15202]: I0319 09:28:38.589295 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80f89d04-6a07-4e86-b211-273789da32f2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590011 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-config-volume\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590185 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590320 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590378 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80f89d04-6a07-4e86-b211-273789da32f2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590456 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-web-config\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590665 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f89d04-6a07-4e86-b211-273789da32f2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590750 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7mh\" (UniqueName: \"kubernetes.io/projected/80f89d04-6a07-4e86-b211-273789da32f2-kube-api-access-jg7mh\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.592463 master-0 kubenswrapper[15202]: I0319 09:28:38.590868 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80f89d04-6a07-4e86-b211-273789da32f2-config-out\") pod \"alertmanager-main-0\" (UID: 
\"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.596823 master-0 kubenswrapper[15202]: I0319 09:28:38.596763 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-config-volume\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.597069 master-0 kubenswrapper[15202]: I0319 09:28:38.596908 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80f89d04-6a07-4e86-b211-273789da32f2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.597228 master-0 kubenswrapper[15202]: I0319 09:28:38.597168 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80f89d04-6a07-4e86-b211-273789da32f2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.597379 master-0 kubenswrapper[15202]: I0319 09:28:38.597349 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.597445 master-0 kubenswrapper[15202]: I0319 09:28:38.597403 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80f89d04-6a07-4e86-b211-273789da32f2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.598012 master-0 kubenswrapper[15202]: I0319 09:28:38.597981 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-web-config\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.598825 master-0 kubenswrapper[15202]: I0319 09:28:38.598756 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.599220 master-0 kubenswrapper[15202]: I0319 09:28:38.599175 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.600508 master-0 kubenswrapper[15202]: I0319 09:28:38.600419 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80f89d04-6a07-4e86-b211-273789da32f2-config-out\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.601658 master-0 kubenswrapper[15202]: I0319 09:28:38.601610 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/80f89d04-6a07-4e86-b211-273789da32f2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.603845 master-0 kubenswrapper[15202]: I0319 09:28:38.603800 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f89d04-6a07-4e86-b211-273789da32f2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.727496 master-0 kubenswrapper[15202]: I0319 09:28:38.725903 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7mh\" (UniqueName: \"kubernetes.io/projected/80f89d04-6a07-4e86-b211-273789da32f2-kube-api-access-jg7mh\") pod \"alertmanager-main-0\" (UID: \"80f89d04-6a07-4e86-b211-273789da32f2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:38.821627 master-0 kubenswrapper[15202]: I0319 09:28:38.821358 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb018304-4128-47a8-a4a6-39245f915703" path="/var/lib/kubelet/pods/bb018304-4128-47a8-a4a6-39245f915703/volumes" Mar 19 09:28:38.933689 master-0 kubenswrapper[15202]: I0319 09:28:38.933609 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:28:39.407397 master-0 kubenswrapper[15202]: I0319 09:28:39.407224 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:28:39.418775 master-0 kubenswrapper[15202]: W0319 09:28:39.418713 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f89d04_6a07_4e86_b211_273789da32f2.slice/crio-3256adbac57629ee3621577f25451fcf091691127c3081ad56785796a1c9d69d WatchSource:0}: Error finding container 3256adbac57629ee3621577f25451fcf091691127c3081ad56785796a1c9d69d: Status 404 returned error can't find the container with id 3256adbac57629ee3621577f25451fcf091691127c3081ad56785796a1c9d69d Mar 19 09:28:39.903753 master-0 kubenswrapper[15202]: I0319 09:28:39.903636 15202 generic.go:334] "Generic (PLEG): container finished" podID="80f89d04-6a07-4e86-b211-273789da32f2" containerID="2d86d8caa30ec4a5dba51bbdbe4d7bfb8a0bdd1a07ab60f7c069673813c422ad" exitCode=0 Mar 19 09:28:39.905553 master-0 kubenswrapper[15202]: I0319 09:28:39.903732 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerDied","Data":"2d86d8caa30ec4a5dba51bbdbe4d7bfb8a0bdd1a07ab60f7c069673813c422ad"} Mar 19 09:28:39.905553 master-0 kubenswrapper[15202]: I0319 09:28:39.903829 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"3256adbac57629ee3621577f25451fcf091691127c3081ad56785796a1c9d69d"} Mar 19 09:28:40.510448 master-0 kubenswrapper[15202]: I0319 09:28:40.510339 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-678cbbd786-bf7l4"] Mar 19 09:28:40.512761 master-0 kubenswrapper[15202]: I0319 
09:28:40.512431 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.518244 master-0 kubenswrapper[15202]: I0319 09:28:40.518191 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 19 09:28:40.518390 master-0 kubenswrapper[15202]: I0319 09:28:40.518332 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 19 09:28:40.518554 master-0 kubenswrapper[15202]: I0319 09:28:40.518523 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 19 09:28:40.518778 master-0 kubenswrapper[15202]: I0319 09:28:40.518750 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 19 09:28:40.518850 master-0 kubenswrapper[15202]: I0319 09:28:40.518796 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 19 09:28:40.522177 master-0 kubenswrapper[15202]: I0319 09:28:40.522138 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533637 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-metrics-client-ca\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533714 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qwklr\" (UniqueName: \"kubernetes.io/projected/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-kube-api-access-qwklr\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533756 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-telemeter-client-tls\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533785 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533809 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-serving-certs-ca-bundle\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533849 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-secret-telemeter-client\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533871 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-federate-client-tls\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.538488 master-0 kubenswrapper[15202]: I0319 09:28:40.533931 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-telemeter-trusted-ca-bundle\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.542333 master-0 kubenswrapper[15202]: I0319 09:28:40.542065 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-678cbbd786-bf7l4"] Mar 19 09:28:40.613916 master-0 kubenswrapper[15202]: I0319 09:28:40.605502 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:28:40.613916 master-0 kubenswrapper[15202]: I0319 09:28:40.605594 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:28:40.635885 master-0 kubenswrapper[15202]: I0319 09:28:40.635799 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-telemeter-client-tls\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.636771 master-0 kubenswrapper[15202]: I0319 09:28:40.636738 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.638282 master-0 kubenswrapper[15202]: I0319 09:28:40.638221 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-serving-certs-ca-bundle\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.638386 master-0 kubenswrapper[15202]: I0319 09:28:40.638367 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-serving-certs-ca-bundle\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.638579 master-0 kubenswrapper[15202]: I0319 09:28:40.638561 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-secret-telemeter-client\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.639134 master-0 kubenswrapper[15202]: I0319 09:28:40.639117 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-federate-client-tls\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.639387 master-0 kubenswrapper[15202]: I0319 09:28:40.639373 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-telemeter-trusted-ca-bundle\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.639607 master-0 kubenswrapper[15202]: I0319 09:28:40.639593 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-metrics-client-ca\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.639745 master-0 kubenswrapper[15202]: I0319 09:28:40.639732 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwklr\" (UniqueName: \"kubernetes.io/projected/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-kube-api-access-qwklr\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: 
\"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.640812 master-0 kubenswrapper[15202]: I0319 09:28:40.640748 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-telemeter-trusted-ca-bundle\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.640932 master-0 kubenswrapper[15202]: I0319 09:28:40.640876 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-metrics-client-ca\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.643752 master-0 kubenswrapper[15202]: I0319 09:28:40.643542 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-secret-telemeter-client\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.646863 master-0 kubenswrapper[15202]: I0319 09:28:40.645136 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.649142 master-0 kubenswrapper[15202]: I0319 09:28:40.649116 15202 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-federate-client-tls\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.658524 master-0 kubenswrapper[15202]: I0319 09:28:40.658024 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-telemeter-client-tls\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.659621 master-0 kubenswrapper[15202]: I0319 09:28:40.659528 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwklr\" (UniqueName: \"kubernetes.io/projected/fca3be47-3f1e-4b84-be7f-dffa5ce46d08-kube-api-access-qwklr\") pod \"telemeter-client-678cbbd786-bf7l4\" (UID: \"fca3be47-3f1e-4b84-be7f-dffa5ce46d08\") " pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.720931 master-0 kubenswrapper[15202]: I0319 09:28:40.720770 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:28:40.721703 master-0 kubenswrapper[15202]: I0319 09:28:40.721662 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="prometheus" containerID="cri-o://3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e" gracePeriod=600 Mar 19 09:28:40.722854 master-0 kubenswrapper[15202]: I0319 09:28:40.722227 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy" 
containerID="cri-o://6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2" gracePeriod=600 Mar 19 09:28:40.722854 master-0 kubenswrapper[15202]: I0319 09:28:40.722552 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-thanos" containerID="cri-o://684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a" gracePeriod=600 Mar 19 09:28:40.722854 master-0 kubenswrapper[15202]: I0319 09:28:40.722629 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="thanos-sidecar" containerID="cri-o://1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8" gracePeriod=600 Mar 19 09:28:40.722854 master-0 kubenswrapper[15202]: I0319 09:28:40.722683 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-web" containerID="cri-o://f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2" gracePeriod=600 Mar 19 09:28:40.722854 master-0 kubenswrapper[15202]: I0319 09:28:40.722734 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="config-reloader" containerID="cri-o://503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b" gracePeriod=600 Mar 19 09:28:40.881494 master-0 kubenswrapper[15202]: I0319 09:28:40.881404 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954204 15202 generic.go:334] "Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2" exitCode=0 Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954251 15202 generic.go:334] "Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8" exitCode=0 Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954260 15202 generic.go:334] "Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b" exitCode=0 Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954268 15202 generic.go:334] "Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e" exitCode=0 Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954323 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2"} Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954362 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8"} Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954374 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b"} Mar 19 09:28:40.958391 master-0 kubenswrapper[15202]: I0319 09:28:40.954384 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"} Mar 19 09:28:40.964670 master-0 kubenswrapper[15202]: I0319 09:28:40.964599 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"5a4e47f4b7eee097fe512d3b16fe7d0488931ed76ae0dc8bdc807d82184aecbe"} Mar 19 09:28:40.964670 master-0 kubenswrapper[15202]: I0319 09:28:40.964648 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"d5855b5ef54408bdfd0ed0e63b96577bfe881446fec2d31adc63178567d72d1f"} Mar 19 09:28:40.964670 master-0 kubenswrapper[15202]: I0319 09:28:40.964661 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"a420f3d4dfd453d020155a2e35742b59428def369cb1a5fa0346f4f59818e9a0"} Mar 19 09:28:40.964798 master-0 kubenswrapper[15202]: I0319 09:28:40.964675 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"aed8ee690f85926fdeadb86e1f204b10966a1f47be4d39267bd4380b5bbf6de5"} Mar 19 09:28:40.964798 master-0 kubenswrapper[15202]: I0319 09:28:40.964689 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"b142e07205fc159665564afa754c4daac01d1cc33463b197267bbf903ae7cfd5"} Mar 19 09:28:41.429474 master-0 kubenswrapper[15202]: I0319 09:28:41.429407 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-678cbbd786-bf7l4"] Mar 19 09:28:41.477566 master-0 kubenswrapper[15202]: I0319 09:28:41.477515 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:28:41.656060 master-0 kubenswrapper[15202]: I0319 09:28:41.655860 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:28:41.656060 master-0 kubenswrapper[15202]: I0319 09:28:41.655948 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:28:41.672579 master-0 kubenswrapper[15202]: I0319 09:28:41.672456 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-web-config\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.672945 master-0 kubenswrapper[15202]: I0319 09:28:41.672633 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-rulefiles-0\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: 
\"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.672945 master-0 kubenswrapper[15202]: I0319 09:28:41.672692 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.672945 master-0 kubenswrapper[15202]: I0319 09:28:41.672756 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-tls-assets\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.672945 master-0 kubenswrapper[15202]: I0319 09:28:41.672809 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-metrics-client-certs\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.672945 master-0 kubenswrapper[15202]: I0319 09:28:41.672840 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-config\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.672945 master-0 kubenswrapper[15202]: I0319 09:28:41.672878 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-kube-rbac-proxy\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: 
I0319 09:28:41.672947 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-config-out\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: I0319 09:28:41.672981 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-metrics-client-ca\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: I0319 09:28:41.673012 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-serving-certs-ca-bundle\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: I0319 09:28:41.673061 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-kubelet-serving-ca-bundle\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: I0319 09:28:41.673121 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-thanos-prometheus-http-client-file\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: I0319 09:28:41.673160 
15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-tls\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673211 master-0 kubenswrapper[15202]: I0319 09:28:41.673194 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673513 master-0 kubenswrapper[15202]: I0319 09:28:41.673227 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6t4k\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-kube-api-access-l6t4k\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673513 master-0 kubenswrapper[15202]: I0319 09:28:41.673269 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-db\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673513 master-0 kubenswrapper[15202]: I0319 09:28:41.673317 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-grpc-tls\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.673513 master-0 kubenswrapper[15202]: I0319 09:28:41.673430 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-trusted-ca-bundle\") pod \"75eefc3a-d29d-499e-98fd-7292ff09c294\" (UID: \"75eefc3a-d29d-499e-98fd-7292ff09c294\") " Mar 19 09:28:41.674622 master-0 kubenswrapper[15202]: I0319 09:28:41.674569 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:41.674960 master-0 kubenswrapper[15202]: I0319 09:28:41.674578 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:41.676274 master-0 kubenswrapper[15202]: I0319 09:28:41.676204 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:41.676672 master-0 kubenswrapper[15202]: I0319 09:28:41.676633 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:41.678841 master-0 kubenswrapper[15202]: I0319 09:28:41.678602 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.679695 master-0 kubenswrapper[15202]: I0319 09:28:41.678867 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.679695 master-0 kubenswrapper[15202]: I0319 09:28:41.679514 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:41.680234 master-0 kubenswrapper[15202]: I0319 09:28:41.679874 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-config" (OuterVolumeSpecName: "config") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.680234 master-0 kubenswrapper[15202]: I0319 09:28:41.680167 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:28:41.680930 master-0 kubenswrapper[15202]: I0319 09:28:41.680906 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:28:41.681813 master-0 kubenswrapper[15202]: I0319 09:28:41.681768 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.682748 master-0 kubenswrapper[15202]: I0319 09:28:41.682711 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.682860 master-0 kubenswrapper[15202]: I0319 09:28:41.682816 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-kube-api-access-l6t4k" (OuterVolumeSpecName: "kube-api-access-l6t4k") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "kube-api-access-l6t4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:41.683096 master-0 kubenswrapper[15202]: I0319 09:28:41.683037 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.683217 master-0 kubenswrapper[15202]: I0319 09:28:41.683176 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.683768 master-0 kubenswrapper[15202]: I0319 09:28:41.683732 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.685482 master-0 kubenswrapper[15202]: I0319 09:28:41.685394 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-config-out" (OuterVolumeSpecName: "config-out") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:28:41.725863 master-0 kubenswrapper[15202]: I0319 09:28:41.725752 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-web-config" (OuterVolumeSpecName: "web-config") pod "75eefc3a-d29d-499e-98fd-7292ff09c294" (UID: "75eefc3a-d29d-499e-98fd-7292ff09c294"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775787 15202 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-config-out\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775827 15202 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775841 15202 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775852 15202 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775864 15202 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775874 15202 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 
09:28:41.775885 15202 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775896 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6t4k\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-kube-api-access-l6t4k\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775907 15202 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775919 15202 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775931 15202 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775940 15202 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-web-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775952 15202 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/75eefc3a-d29d-499e-98fd-7292ff09c294-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775965 15202 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775975 15202 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75eefc3a-d29d-499e-98fd-7292ff09c294-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775984 15202 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.775994 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.776031 master-0 kubenswrapper[15202]: I0319 09:28:41.776005 15202 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75eefc3a-d29d-499e-98fd-7292ff09c294-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:41.978204 master-0 kubenswrapper[15202]: I0319 09:28:41.978111 15202 generic.go:334] "Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a" exitCode=0 Mar 19 09:28:41.978204 master-0 kubenswrapper[15202]: I0319 09:28:41.978182 15202 generic.go:334] 
"Generic (PLEG): container finished" podID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerID="6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2" exitCode=0 Mar 19 09:28:41.979581 master-0 kubenswrapper[15202]: I0319 09:28:41.978239 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:28:41.979581 master-0 kubenswrapper[15202]: I0319 09:28:41.978276 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a"} Mar 19 09:28:41.979581 master-0 kubenswrapper[15202]: I0319 09:28:41.978390 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2"} Mar 19 09:28:41.979581 master-0 kubenswrapper[15202]: I0319 09:28:41.978413 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75eefc3a-d29d-499e-98fd-7292ff09c294","Type":"ContainerDied","Data":"72318015826a9a6df10b0ba1b3beedcda5e582b26dc3e68735c4af6fa4b42d32"} Mar 19 09:28:41.979581 master-0 kubenswrapper[15202]: I0319 09:28:41.978445 15202 scope.go:117] "RemoveContainer" containerID="684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a" Mar 19 09:28:41.989226 master-0 kubenswrapper[15202]: I0319 09:28:41.989100 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80f89d04-6a07-4e86-b211-273789da32f2","Type":"ContainerStarted","Data":"8001d0d6ccbd40fed90f874c33b19299d833313b4ce150fde9e7dfa24105e1c1"} Mar 19 09:28:41.996496 master-0 kubenswrapper[15202]: I0319 09:28:41.996396 15202 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" event={"ID":"fca3be47-3f1e-4b84-be7f-dffa5ce46d08","Type":"ContainerStarted","Data":"3d3b66025e791ccb63bc6fc0d10f69dc4d37bed96e8eb487143dfba768876472"} Mar 19 09:28:42.018403 master-0 kubenswrapper[15202]: I0319 09:28:42.018328 15202 scope.go:117] "RemoveContainer" containerID="6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2" Mar 19 09:28:42.043674 master-0 kubenswrapper[15202]: I0319 09:28:42.043524 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.043485487 podStartE2EDuration="4.043485487s" podCreationTimestamp="2026-03-19 09:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:42.03494543 +0000 UTC m=+239.420360297" watchObservedRunningTime="2026-03-19 09:28:42.043485487 +0000 UTC m=+239.428900313" Mar 19 09:28:42.096928 master-0 kubenswrapper[15202]: I0319 09:28:42.070716 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:28:42.096928 master-0 kubenswrapper[15202]: I0319 09:28:42.071427 15202 scope.go:117] "RemoveContainer" containerID="f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2" Mar 19 09:28:42.114932 master-0 kubenswrapper[15202]: I0319 09:28:42.114855 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:28:42.134308 master-0 kubenswrapper[15202]: I0319 09:28:42.134251 15202 scope.go:117] "RemoveContainer" containerID="1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8" Mar 19 09:28:42.151199 master-0 kubenswrapper[15202]: I0319 09:28:42.151124 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151520 15202 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="init-config-reloader" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151541 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="init-config-reloader" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151572 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-thanos" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151580 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-thanos" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151598 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="prometheus" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151605 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="prometheus" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151620 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="config-reloader" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151626 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="config-reloader" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151647 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="thanos-sidecar" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151653 15202 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="thanos-sidecar" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151674 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151682 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: E0319 09:28:42.151692 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-web" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151698 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-web" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151833 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-web" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151856 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy-thanos" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151883 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="kube-rbac-proxy" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151896 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="prometheus" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151912 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" 
containerName="thanos-sidecar" Mar 19 09:28:42.151973 master-0 kubenswrapper[15202]: I0319 09:28:42.151924 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" containerName="config-reloader" Mar 19 09:28:42.156005 master-0 kubenswrapper[15202]: I0319 09:28:42.155942 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:28:42.161258 master-0 kubenswrapper[15202]: I0319 09:28:42.161206 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 09:28:42.161548 master-0 kubenswrapper[15202]: I0319 09:28:42.161489 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 09:28:42.161548 master-0 kubenswrapper[15202]: I0319 09:28:42.161512 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 09:28:42.161824 master-0 kubenswrapper[15202]: I0319 09:28:42.161795 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 09:28:42.162020 master-0 kubenswrapper[15202]: I0319 09:28:42.161993 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 19 09:28:42.162177 master-0 kubenswrapper[15202]: I0319 09:28:42.162152 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 09:28:42.162352 master-0 kubenswrapper[15202]: I0319 09:28:42.162325 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4i3vpe46p0rrq" Mar 19 09:28:42.162562 master-0 kubenswrapper[15202]: I0319 09:28:42.162535 15202 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 09:28:42.162729 master-0 kubenswrapper[15202]: I0319 09:28:42.162701 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 09:28:42.169682 master-0 kubenswrapper[15202]: I0319 09:28:42.169592 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 09:28:42.170577 master-0 kubenswrapper[15202]: I0319 09:28:42.170538 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 09:28:42.173135 master-0 kubenswrapper[15202]: I0319 09:28:42.173088 15202 scope.go:117] "RemoveContainer" containerID="503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b" Mar 19 09:28:42.176850 master-0 kubenswrapper[15202]: I0319 09:28:42.176800 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 09:28:42.185618 master-0 kubenswrapper[15202]: I0319 09:28:42.185557 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:28:42.235667 master-0 kubenswrapper[15202]: I0319 09:28:42.235586 15202 scope.go:117] "RemoveContainer" containerID="3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e" Mar 19 09:28:42.254299 master-0 kubenswrapper[15202]: I0319 09:28:42.254248 15202 scope.go:117] "RemoveContainer" containerID="909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33" Mar 19 09:28:42.273424 master-0 kubenswrapper[15202]: I0319 09:28:42.273380 15202 scope.go:117] "RemoveContainer" containerID="684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a" Mar 19 09:28:42.274295 master-0 kubenswrapper[15202]: E0319 09:28:42.274249 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a\": container with ID starting with 684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a not found: ID does not exist" containerID="684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a" Mar 19 09:28:42.275343 master-0 kubenswrapper[15202]: I0319 09:28:42.275278 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a"} err="failed to get container status \"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a\": rpc error: code = NotFound desc = could not find container \"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a\": container with ID starting with 684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a not found: ID does not exist" Mar 19 09:28:42.275343 master-0 kubenswrapper[15202]: I0319 09:28:42.275337 15202 scope.go:117] "RemoveContainer" containerID="6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2" Mar 19 09:28:42.275866 master-0 kubenswrapper[15202]: E0319 09:28:42.275777 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2\": container with ID starting with 6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2 not found: ID does not exist" containerID="6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2" Mar 19 09:28:42.275939 master-0 kubenswrapper[15202]: I0319 09:28:42.275866 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2"} err="failed to get container status \"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2\": rpc error: code = NotFound desc = could not find container 
\"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2\": container with ID starting with 6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2 not found: ID does not exist" Mar 19 09:28:42.275939 master-0 kubenswrapper[15202]: I0319 09:28:42.275925 15202 scope.go:117] "RemoveContainer" containerID="f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2" Mar 19 09:28:42.276770 master-0 kubenswrapper[15202]: E0319 09:28:42.276392 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2\": container with ID starting with f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2 not found: ID does not exist" containerID="f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2" Mar 19 09:28:42.276770 master-0 kubenswrapper[15202]: I0319 09:28:42.276669 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2"} err="failed to get container status \"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2\": rpc error: code = NotFound desc = could not find container \"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2\": container with ID starting with f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2 not found: ID does not exist" Mar 19 09:28:42.276770 master-0 kubenswrapper[15202]: I0319 09:28:42.276716 15202 scope.go:117] "RemoveContainer" containerID="1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8" Mar 19 09:28:42.277779 master-0 kubenswrapper[15202]: E0319 09:28:42.277601 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8\": container with ID starting with 
1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8 not found: ID does not exist" containerID="1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8" Mar 19 09:28:42.277779 master-0 kubenswrapper[15202]: I0319 09:28:42.277632 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8"} err="failed to get container status \"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8\": rpc error: code = NotFound desc = could not find container \"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8\": container with ID starting with 1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8 not found: ID does not exist" Mar 19 09:28:42.277779 master-0 kubenswrapper[15202]: I0319 09:28:42.277649 15202 scope.go:117] "RemoveContainer" containerID="503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b" Mar 19 09:28:42.278214 master-0 kubenswrapper[15202]: E0319 09:28:42.278174 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b\": container with ID starting with 503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b not found: ID does not exist" containerID="503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b" Mar 19 09:28:42.278288 master-0 kubenswrapper[15202]: I0319 09:28:42.278215 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b"} err="failed to get container status \"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b\": rpc error: code = NotFound desc = could not find container \"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b\": container with ID starting with 
503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b not found: ID does not exist"
Mar 19 09:28:42.278288 master-0 kubenswrapper[15202]: I0319 09:28:42.278234 15202 scope.go:117] "RemoveContainer" containerID="3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"
Mar 19 09:28:42.278600 master-0 kubenswrapper[15202]: E0319 09:28:42.278567 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e\": container with ID starting with 3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e not found: ID does not exist" containerID="3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"
Mar 19 09:28:42.278662 master-0 kubenswrapper[15202]: I0319 09:28:42.278603 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"} err="failed to get container status \"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e\": rpc error: code = NotFound desc = could not find container \"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e\": container with ID starting with 3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e not found: ID does not exist"
Mar 19 09:28:42.278662 master-0 kubenswrapper[15202]: I0319 09:28:42.278625 15202 scope.go:117] "RemoveContainer" containerID="909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33"
Mar 19 09:28:42.279052 master-0 kubenswrapper[15202]: E0319 09:28:42.279020 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33\": container with ID starting with 909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33 not found: ID does not exist" containerID="909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33"
Mar 19 09:28:42.279123 master-0 kubenswrapper[15202]: I0319 09:28:42.279054 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33"} err="failed to get container status \"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33\": rpc error: code = NotFound desc = could not find container \"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33\": container with ID starting with 909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33 not found: ID does not exist"
Mar 19 09:28:42.279123 master-0 kubenswrapper[15202]: I0319 09:28:42.279084 15202 scope.go:117] "RemoveContainer" containerID="684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a"
Mar 19 09:28:42.279436 master-0 kubenswrapper[15202]: I0319 09:28:42.279402 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a"} err="failed to get container status \"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a\": rpc error: code = NotFound desc = could not find container \"684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a\": container with ID starting with 684f8df7cdebcf6680dbfb5c75e9868bbd6d4f4cc291ae943200aa6c8db4819a not found: ID does not exist"
Mar 19 09:28:42.279436 master-0 kubenswrapper[15202]: I0319 09:28:42.279429 15202 scope.go:117] "RemoveContainer" containerID="6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2"
Mar 19 09:28:42.279943 master-0 kubenswrapper[15202]: I0319 09:28:42.279892 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2"} err="failed to get container status \"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2\": rpc error: code = NotFound desc = could not find container \"6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2\": container with ID starting with 6c5d5abacc6c17c4613c77a2b81cd0c8abf2f4d8e268e91f9579d99e0a4b68f2 not found: ID does not exist"
Mar 19 09:28:42.280027 master-0 kubenswrapper[15202]: I0319 09:28:42.279944 15202 scope.go:117] "RemoveContainer" containerID="f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2"
Mar 19 09:28:42.280913 master-0 kubenswrapper[15202]: I0319 09:28:42.280777 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2"} err="failed to get container status \"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2\": rpc error: code = NotFound desc = could not find container \"f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2\": container with ID starting with f7795f62072eff076f0b9a653e15ebed2e12bf57f9d7fe52f8ce8dfbcab502e2 not found: ID does not exist"
Mar 19 09:28:42.281100 master-0 kubenswrapper[15202]: I0319 09:28:42.280916 15202 scope.go:117] "RemoveContainer" containerID="1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8"
Mar 19 09:28:42.281504 master-0 kubenswrapper[15202]: I0319 09:28:42.281390 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8"} err="failed to get container status \"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8\": rpc error: code = NotFound desc = could not find container \"1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8\": container with ID starting with 1f987d4089b52797c44a0ca4f295b882b55c19156d1b48f219c7427549c47fb8 not found: ID does not exist"
Mar 19 09:28:42.281504 master-0 kubenswrapper[15202]: I0319 09:28:42.281488 15202 scope.go:117] "RemoveContainer" containerID="503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b"
Mar 19 09:28:42.281859 master-0 kubenswrapper[15202]: I0319 09:28:42.281819 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b"} err="failed to get container status \"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b\": rpc error: code = NotFound desc = could not find container \"503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b\": container with ID starting with 503c9f1a0f93b016bead6c3949304802d77161eeb84de3e4cf3788e902a72e8b not found: ID does not exist"
Mar 19 09:28:42.281859 master-0 kubenswrapper[15202]: I0319 09:28:42.281837 15202 scope.go:117] "RemoveContainer" containerID="3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"
Mar 19 09:28:42.282088 master-0 kubenswrapper[15202]: I0319 09:28:42.282057 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e"} err="failed to get container status \"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e\": rpc error: code = NotFound desc = could not find container \"3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e\": container with ID starting with 3b7004fd763d960d36f4bd9d0fc01640aa7b5da0fa66cf44ab2c726d3fea891e not found: ID does not exist"
Mar 19 09:28:42.282088 master-0 kubenswrapper[15202]: I0319 09:28:42.282088 15202 scope.go:117] "RemoveContainer" containerID="909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33"
Mar 19 09:28:42.282524 master-0 kubenswrapper[15202]: I0319 09:28:42.282446 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33"} err="failed to get container status \"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33\": rpc error: code = NotFound desc = could not find container \"909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33\": container with ID starting with 909915c3c1411a20c449d01e7669a10c6e2f7b046421716d1333fe949281fd33 not found: ID does not exist"
Mar 19 09:28:42.308441 master-0 kubenswrapper[15202]: I0319 09:28:42.308386 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.308589 master-0 kubenswrapper[15202]: I0319 09:28:42.308488 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.308715 master-0 kubenswrapper[15202]: I0319 09:28:42.308651 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.308783 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.308838 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.308871 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.308956 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309010 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309060 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-config\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309117 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309357 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309393 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309416 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309461 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309601 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309645 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jht6c\" (UniqueName: \"kubernetes.io/projected/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-kube-api-access-jht6c\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309800 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.309899 master-0 kubenswrapper[15202]: I0319 09:28:42.309827 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412026 master-0 kubenswrapper[15202]: I0319 09:28:42.411973 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412168 master-0 kubenswrapper[15202]: I0319 09:28:42.412068 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412257 master-0 kubenswrapper[15202]: I0319 09:28:42.412218 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412317 master-0 kubenswrapper[15202]: I0319 09:28:42.412271 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412317 master-0 kubenswrapper[15202]: I0319 09:28:42.412310 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412419 master-0 kubenswrapper[15202]: I0319 09:28:42.412363 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412419 master-0 kubenswrapper[15202]: I0319 09:28:42.412410 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jht6c\" (UniqueName: \"kubernetes.io/projected/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-kube-api-access-jht6c\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412714 master-0 kubenswrapper[15202]: I0319 09:28:42.412675 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412771 master-0 kubenswrapper[15202]: I0319 09:28:42.412726 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412771 master-0 kubenswrapper[15202]: I0319 09:28:42.412759 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412952 master-0 kubenswrapper[15202]: I0319 09:28:42.412822 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412952 master-0 kubenswrapper[15202]: I0319 09:28:42.412858 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412952 master-0 kubenswrapper[15202]: I0319 09:28:42.412890 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.412952 master-0 kubenswrapper[15202]: I0319 09:28:42.412920 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.413074 master-0 kubenswrapper[15202]: I0319 09:28:42.412950 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.413074 master-0 kubenswrapper[15202]: I0319 09:28:42.412985 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.413074 master-0 kubenswrapper[15202]: I0319 09:28:42.413016 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.413074 master-0 kubenswrapper[15202]: I0319 09:28:42.413042 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-config\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.413200 master-0 kubenswrapper[15202]: I0319 09:28:42.413152 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.414135 master-0 kubenswrapper[15202]: I0319 09:28:42.414080 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.416965 master-0 kubenswrapper[15202]: I0319 09:28:42.416920 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.417898 master-0 kubenswrapper[15202]: I0319 09:28:42.417851 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.418404 master-0 kubenswrapper[15202]: I0319 09:28:42.418365 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.419230 master-0 kubenswrapper[15202]: I0319 09:28:42.419186 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.420251 master-0 kubenswrapper[15202]: I0319 09:28:42.420208 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.420658 master-0 kubenswrapper[15202]: I0319 09:28:42.420600 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.421019 master-0 kubenswrapper[15202]: I0319 09:28:42.420981 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.421281 master-0 kubenswrapper[15202]: I0319 09:28:42.421244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.421693 master-0 kubenswrapper[15202]: I0319 09:28:42.421666 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.422777 master-0 kubenswrapper[15202]: I0319 09:28:42.422724 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.424172 master-0 kubenswrapper[15202]: I0319 09:28:42.424137 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.424870 master-0 kubenswrapper[15202]: I0319 09:28:42.424806 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.426396 master-0 kubenswrapper[15202]: I0319 09:28:42.426356 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-config\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.426719 master-0 kubenswrapper[15202]: I0319 09:28:42.426653 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.428960 master-0 kubenswrapper[15202]: I0319 09:28:42.428884 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.438967 master-0 kubenswrapper[15202]: I0319 09:28:42.438908 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jht6c\" (UniqueName: \"kubernetes.io/projected/c07fbef0-2fa8-4240-8b80-0c96f3ca53c7-kube-api-access-jht6c\") pod \"prometheus-k8s-0\" (UID: \"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.517541 master-0 kubenswrapper[15202]: I0319 09:28:42.517367 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:42.834524 master-0 kubenswrapper[15202]: I0319 09:28:42.834318 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75eefc3a-d29d-499e-98fd-7292ff09c294" path="/var/lib/kubelet/pods/75eefc3a-d29d-499e-98fd-7292ff09c294/volumes"
Mar 19 09:28:42.974996 master-0 kubenswrapper[15202]: I0319 09:28:42.974929 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 09:28:42.981548 master-0 kubenswrapper[15202]: W0319 09:28:42.981446 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07fbef0_2fa8_4240_8b80_0c96f3ca53c7.slice/crio-433151d625777c719adf1a2c2e542d905ce72932b97f718952ae02f405c3c867 WatchSource:0}: Error finding container 433151d625777c719adf1a2c2e542d905ce72932b97f718952ae02f405c3c867: Status 404 returned error can't find the container with id 433151d625777c719adf1a2c2e542d905ce72932b97f718952ae02f405c3c867
Mar 19 09:28:43.011948 master-0 kubenswrapper[15202]: I0319 09:28:43.011862 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"433151d625777c719adf1a2c2e542d905ce72932b97f718952ae02f405c3c867"}
Mar 19 09:28:44.024182 master-0 kubenswrapper[15202]: I0319 09:28:44.024102 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" event={"ID":"fca3be47-3f1e-4b84-be7f-dffa5ce46d08","Type":"ContainerStarted","Data":"65b545f8eee72db4d26fc101c21dbb5c3be4f10200e0b20796ac360d53a18d5a"}
Mar 19 09:28:44.030642 master-0 kubenswrapper[15202]: I0319 09:28:44.030574 15202 generic.go:334] "Generic (PLEG): container finished" podID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" containerID="b3f45ec8cb6b4de9881be953b3a34408aeb9a5bb3e549f6d4b63da8cb15c57d3" exitCode=0
Mar 19 09:28:44.030754 master-0 kubenswrapper[15202]: I0319 09:28:44.030673 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerDied","Data":"b3f45ec8cb6b4de9881be953b3a34408aeb9a5bb3e549f6d4b63da8cb15c57d3"}
Mar 19 09:28:45.050104 master-0 kubenswrapper[15202]: I0319 09:28:45.049954 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" event={"ID":"fca3be47-3f1e-4b84-be7f-dffa5ce46d08","Type":"ContainerStarted","Data":"7a7fe84933ebdfc70862b83f6ad4dcba63d3e4b174cde58672e9f6e46fc52bef"}
Mar 19 09:28:45.050794 master-0 kubenswrapper[15202]: I0319 09:28:45.050170 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" event={"ID":"fca3be47-3f1e-4b84-be7f-dffa5ce46d08","Type":"ContainerStarted","Data":"288429c0caa62895c71ff1410e06f07fadbd9a33eb0eb396ac12257a73a07cd6"}
Mar 19 09:28:45.053884 master-0 kubenswrapper[15202]: I0319 09:28:45.053760 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:28:45.056542 master-0 kubenswrapper[15202]: I0319 09:28:45.056494 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:28:45.056681 master-0 kubenswrapper[15202]: I0319 09:28:45.056612 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"c86cc9c162aeced7e313c6cb48ece9c01e722b02190954f661f98f989594007e"}
Mar 19 09:28:45.056759 master-0 kubenswrapper[15202]: I0319 09:28:45.056682 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.056759 master-0 kubenswrapper[15202]: I0319 09:28:45.056690 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"c3856e2f2a3e19138f7ddcf54d01656122fb6fc3a12e89a3282f31da2121c8e6"}
Mar 19 09:28:45.056922 master-0 kubenswrapper[15202]: I0319 09:28:45.056771 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:28:45.056998 master-0 kubenswrapper[15202]: I0319 09:28:45.056911 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver" containerID="cri-o://3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788" gracePeriod=15
Mar 19 09:28:45.057172 master-0 kubenswrapper[15202]: I0319 09:28:45.057124 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15" gracePeriod=15
Mar 19 09:28:45.057257 master-0 kubenswrapper[15202]: I0319 09:28:45.057194 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486" gracePeriod=15
Mar 19 09:28:45.057257 master-0 kubenswrapper[15202]: I0319 09:28:45.057240 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84" gracePeriod=15
Mar 19 09:28:45.057404 master-0 kubenswrapper[15202]: I0319 09:28:45.057302 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a" gracePeriod=15
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057706 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057732 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057752 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="setup"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057761 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="setup"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057786 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-syncer"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057794 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-syncer"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057809 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057818 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057837 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057847 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057862 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057870 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: E0319 09:28:45.057908 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints"
Mar 19 09:28:45.057898 master-0 kubenswrapper[15202]: I0319 09:28:45.057918 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints"
Mar 19 09:28:45.058803 master-0 kubenswrapper[15202]: I0319 09:28:45.058084 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-insecure-readyz"
Mar 19 09:28:45.058803 master-0 kubenswrapper[15202]: I0319 09:28:45.058106 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-syncer"
Mar 19 09:28:45.058803 master-0 kubenswrapper[15202]: I0319 09:28:45.058133 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 09:28:45.058803 master-0 kubenswrapper[15202]: I0319 09:28:45.058149 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver"
Mar 19 09:28:45.058803 master-0 kubenswrapper[15202]: I0319 09:28:45.058167 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints"
Mar 19 09:28:45.058803 master-0 kubenswrapper[15202]: I0319 09:28:45.058184 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1511182fa3564db9f50c25912cc22f" containerName="kube-apiserver-check-endpoints"
Mar 19 09:28:45.063282 master-0 kubenswrapper[15202]: I0319 09:28:45.063175 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"e1a29b2d8fdb10f6fc75826aeaab0d974bc2d4a82ae2be1bb41b890fbf49ff09"}
Mar 19 09:28:45.063282 master-0 kubenswrapper[15202]: I0319 09:28:45.063280 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"50da9a261155f511f71125e9d3077f1b9df4244530f3376129ede4a254faf30a"}
Mar 19 09:28:45.063568 master-0 kubenswrapper[15202]: I0319 09:28:45.063304 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"bd4a14b4f10aa909f25487701e446a017272c60141489f807f56cc3523e0bdcd"}
Mar 19 09:28:45.135670 master-0 kubenswrapper[15202]: I0319 09:28:45.135313 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7a1511182fa3564db9f50c25912cc22f" podUID="7d5ce05b3d592e63f1f92202d52b9635"
Mar 19 09:28:45.141231 master-0 kubenswrapper[15202]: I0319 09:28:45.140998 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-678cbbd786-bf7l4" podStartSLOduration=2.790227313 podStartE2EDuration="5.140948365s" podCreationTimestamp="2026-03-19 09:28:40 +0000 UTC" firstStartedPulling="2026-03-19 09:28:41.439553332 +0000 UTC m=+238.824968148" lastFinishedPulling="2026-03-19 09:28:43.790274384 +0000 UTC m=+241.175689200" observedRunningTime="2026-03-19 09:28:45.125091893 +0000 UTC m=+242.510506739" watchObservedRunningTime="2026-03-19 09:28:45.140948365 +0000 UTC m=+242.526363191"
Mar 19 09:28:45.143128 master-0 kubenswrapper[15202]: I0319 09:28:45.143068 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:28:45.190542 master-0 kubenswrapper[15202]: I0319 09:28:45.190455 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.190998 master-0 kubenswrapper[15202]: I0319 09:28:45.190965 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.191096 master-0 kubenswrapper[15202]: I0319 09:28:45.191025 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.191096 master-0 kubenswrapper[15202]: I0319 09:28:45.191063 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.191096 master-0 kubenswrapper[15202]: I0319 09:28:45.191086 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.191241 master-0 kubenswrapper[15202]: I0319 09:28:45.191108 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.191776 master-0 kubenswrapper[15202]: I0319 09:28:45.191398 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.191776 master-0 kubenswrapper[15202]: I0319 09:28:45.191505 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.283002 master-0 kubenswrapper[15202]: E0319 09:28:45.282861 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{prometheus-k8s-0.189e340b0c0a9c32 openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:prometheus-k8s-0,UID:c07fbef0-2fa8-4240-8b80-0c96f3ca53c7,APIVersion:v1,ResourceVersion:14295,FieldPath:spec.containers{kube-rbac-proxy-thanos},},Reason:Created,Message:Created container: kube-rbac-proxy-thanos,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:28:45.281860658 +0000 UTC m=+242.667275474,LastTimestamp:2026-03-19 09:28:45.281860658 +0000 UTC m=+242.667275474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:28:45.293905 master-0 kubenswrapper[15202]: I0319 09:28:45.293757 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.294059 master-0 kubenswrapper[15202]: I0319 09:28:45.294030 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.294342 master-0 kubenswrapper[15202]: I0319 09:28:45.294306 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.294426 master-0 kubenswrapper[15202]: I0319 09:28:45.294394 15202 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.294676 master-0 kubenswrapper[15202]: I0319 09:28:45.294640 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.294830 master-0 kubenswrapper[15202]: I0319 09:28:45.294798 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.294913 master-0 kubenswrapper[15202]: I0319 09:28:45.294887 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.295000 master-0 kubenswrapper[15202]: I0319 09:28:45.294973 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.295309 master-0 kubenswrapper[15202]: I0319 09:28:45.295273 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.295413 master-0 kubenswrapper[15202]: I0319 09:28:45.295383 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.295525 master-0 kubenswrapper[15202]: I0319 09:28:45.295452 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.295612 master-0 kubenswrapper[15202]: I0319 09:28:45.295565 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.295659 master-0 kubenswrapper[15202]: I0319 09:28:45.295638 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.295742 master-0 kubenswrapper[15202]: I0319 09:28:45.295707 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.295813 master-0 kubenswrapper[15202]: I0319 09:28:45.295785 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:45.295898 master-0 kubenswrapper[15202]: I0319 09:28:45.295865 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:45.437378 master-0 kubenswrapper[15202]: I0319 09:28:45.437294 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:28:46.070294 master-0 kubenswrapper[15202]: I0319 09:28:46.070217 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-check-endpoints/0.log"
Mar 19 09:28:46.073406 master-0 kubenswrapper[15202]: I0319 09:28:46.073355 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-cert-syncer/0.log"
Mar 19 09:28:46.074755 master-0 kubenswrapper[15202]: I0319 09:28:46.074673 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15" exitCode=0
Mar 19 09:28:46.074755 master-0 kubenswrapper[15202]: I0319 09:28:46.074744 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486" exitCode=0
Mar 19 09:28:46.074897 master-0 kubenswrapper[15202]: I0319 09:28:46.074763 15202 scope.go:117] "RemoveContainer" containerID="21a9ca68aca58418f611d967784b8b2e15b3acfa4bde8394a7537d1e53b9f6af"
Mar 19 09:28:46.074897 master-0 kubenswrapper[15202]: I0319 09:28:46.074773 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84" exitCode=0
Mar 19 09:28:46.074897 master-0 kubenswrapper[15202]: I0319 09:28:46.074883 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a" exitCode=2
Mar 19 09:28:46.083384 master-0 kubenswrapper[15202]: I0319 09:28:46.083303 15202 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c07fbef0-2fa8-4240-8b80-0c96f3ca53c7","Type":"ContainerStarted","Data":"a507965f9a904cc54bf5bc57df899a1614369f125ea19bac90495f54330c6661"}
Mar 19 09:28:46.085499 master-0 kubenswrapper[15202]: I0319 09:28:46.085407 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:46.086074 master-0 kubenswrapper[15202]: I0319 09:28:46.086017 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"16fb4ea7f83036d9c6adf3454fc7e9db","Type":"ContainerStarted","Data":"28d21e26324d8f2be58c9073e73edf161f575e4263dc7b071b5e8f96cd46fdee"}
Mar 19 09:28:46.086074 master-0 kubenswrapper[15202]: I0319 09:28:46.086072 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"16fb4ea7f83036d9c6adf3454fc7e9db","Type":"ContainerStarted","Data":"1a4d3275a4cab69f1551fb9287df85b0819d93bc0c28b288f459aa0a7204127c"}
Mar 19 09:28:46.086799 master-0 kubenswrapper[15202]: I0319 09:28:46.086711 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:46.093298 master-0 kubenswrapper[15202]: I0319 09:28:46.093161 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:46.094586 master-0 kubenswrapper[15202]: I0319 09:28:46.094517 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:47.101555 master-0 kubenswrapper[15202]: I0319 09:28:47.101457 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-cert-syncer/0.log"
Mar 19 09:28:47.518918 master-0 kubenswrapper[15202]: I0319 09:28:47.518626 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 09:28:47.758859 master-0 kubenswrapper[15202]: I0319 09:28:47.758789 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-cert-syncer/0.log"
Mar 19 09:28:47.759701 master-0 kubenswrapper[15202]: I0319 09:28:47.759672 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:47.761697 master-0 kubenswrapper[15202]: I0319 09:28:47.761625 15202 status_manager.go:851] "Failed to get status for pod" podUID="7a1511182fa3564db9f50c25912cc22f" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:47.762549 master-0 kubenswrapper[15202]: I0319 09:28:47.762475 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:47.763817 master-0 kubenswrapper[15202]: I0319 09:28:47.763700 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:47.864764 master-0 kubenswrapper[15202]: I0319 09:28:47.864547 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") pod \"7a1511182fa3564db9f50c25912cc22f\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") "
Mar 19 09:28:47.865072 master-0 kubenswrapper[15202]: I0319 09:28:47.864768 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7a1511182fa3564db9f50c25912cc22f" (UID: "7a1511182fa3564db9f50c25912cc22f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:28:47.865072 master-0 kubenswrapper[15202]: I0319 09:28:47.864851 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") pod \"7a1511182fa3564db9f50c25912cc22f\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") "
Mar 19 09:28:47.865072 master-0 kubenswrapper[15202]: I0319 09:28:47.864930 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") pod \"7a1511182fa3564db9f50c25912cc22f\" (UID: \"7a1511182fa3564db9f50c25912cc22f\") "
Mar 19 09:28:47.865267 master-0 kubenswrapper[15202]: I0319 09:28:47.865036 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7a1511182fa3564db9f50c25912cc22f" (UID: "7a1511182fa3564db9f50c25912cc22f"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:28:47.865267 master-0 kubenswrapper[15202]: I0319 09:28:47.865101 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "7a1511182fa3564db9f50c25912cc22f" (UID: "7a1511182fa3564db9f50c25912cc22f"). InnerVolumeSpecName "cert-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:28:47.866087 master-0 kubenswrapper[15202]: I0319 09:28:47.866021 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:47.866164 master-0 kubenswrapper[15202]: I0319 09:28:47.866104 15202 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:47.866164 master-0 kubenswrapper[15202]: I0319 09:28:47.866134 15202 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7a1511182fa3564db9f50c25912cc22f-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:28:48.117549 master-0 kubenswrapper[15202]: I0319 09:28:48.117342 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7a1511182fa3564db9f50c25912cc22f/kube-apiserver-cert-syncer/0.log"
Mar 19 09:28:48.119038 master-0 kubenswrapper[15202]: I0319 09:28:48.118958 15202 generic.go:334] "Generic (PLEG): container finished" podID="7a1511182fa3564db9f50c25912cc22f" containerID="3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788" exitCode=0
Mar 19 09:28:48.120808 master-0 kubenswrapper[15202]: I0319 09:28:48.120685 15202 scope.go:117] "RemoveContainer" containerID="3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15"
Mar 19 09:28:48.137810 master-0 kubenswrapper[15202]: I0319 09:28:48.137692 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:28:48.151202 master-0 kubenswrapper[15202]: I0319 09:28:48.151150 15202 scope.go:117] "RemoveContainer" containerID="e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486"
Mar 19 09:28:48.167432 master-0 kubenswrapper[15202]: I0319 09:28:48.167380 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:48.168217 master-0 kubenswrapper[15202]: I0319 09:28:48.168194 15202 status_manager.go:851] "Failed to get status for pod" podUID="7a1511182fa3564db9f50c25912cc22f" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:48.168842 master-0 kubenswrapper[15202]: I0319 09:28:48.168752 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:28:48.181947 master-0 kubenswrapper[15202]: I0319 09:28:48.181863 15202 scope.go:117] "RemoveContainer" containerID="15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84"
Mar 19 09:28:48.202628 master-0 kubenswrapper[15202]: I0319 09:28:48.202510 15202 scope.go:117] "RemoveContainer" containerID="b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a"
Mar 19 09:28:48.234663 master-0 kubenswrapper[15202]: I0319 09:28:48.234583 15202 scope.go:117] "RemoveContainer" containerID="3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788"
Mar 19 09:28:48.261309 master-0 kubenswrapper[15202]: I0319 09:28:48.261243 15202 scope.go:117] "RemoveContainer" containerID="b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce"
Mar 19 09:28:48.280936 master-0 kubenswrapper[15202]: I0319 09:28:48.280883 15202 scope.go:117] "RemoveContainer" containerID="3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15"
Mar 19 09:28:48.281587 master-0 kubenswrapper[15202]: E0319 09:28:48.281535 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15\": container with ID starting with 3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15 not found: ID does not exist" containerID="3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15"
Mar 19 09:28:48.281731 master-0 kubenswrapper[15202]: I0319 09:28:48.281697 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15"} err="failed to get container status \"3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15\": rpc error: code = NotFound desc = could not find container \"3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15\": container with ID starting with 3d23e4bb677395e551dda1c73f17e071f2400660ec2fc74d913c29f6812b2f15 not found: ID does not exist"
Mar 19 09:28:48.281852 master-0 kubenswrapper[15202]: I0319 09:28:48.281832 15202 scope.go:117] "RemoveContainer" containerID="e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486"
Mar 19 09:28:48.283187 master-0 kubenswrapper[15202]: E0319 09:28:48.282863 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486\": container with ID starting with e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486 not found: ID does not exist" containerID="e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486"
Mar 19 09:28:48.283187 master-0 kubenswrapper[15202]: I0319 09:28:48.282932 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486"} err="failed to get container status \"e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486\": rpc error: code = NotFound desc = could not find container \"e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486\": container with ID starting with e4b055ccf670a9233e055f94e3860a15f8605fe89eba1a5671159a207b334486 not found: ID does not exist"
Mar 19 09:28:48.283187 master-0 kubenswrapper[15202]: I0319 09:28:48.282975 15202 scope.go:117] "RemoveContainer" containerID="15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84"
Mar 19 09:28:48.283401 master-0 kubenswrapper[15202]: E0319 09:28:48.283356 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84\": container with ID starting with 15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84 not found: ID does not exist" containerID="15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84"
Mar 19 09:28:48.283401 master-0 kubenswrapper[15202]: I0319 09:28:48.283387 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84"} err="failed to get container status \"15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84\": rpc error: code = NotFound desc = could not find container \"15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84\": container with ID starting with 15bd5044f1a27452a13b2861705316551071c98b6a52a353d455ebc4cbe96a84 not found: ID does not exist"
Mar 19 09:28:48.283754 master-0 kubenswrapper[15202]: I0319 09:28:48.283405 15202 scope.go:117] "RemoveContainer" containerID="b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a"
Mar 19 09:28:48.284326 master-0 kubenswrapper[15202]: E0319 09:28:48.284273 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a\": container with ID starting with b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a not found: ID does not exist" containerID="b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a"
Mar 19 09:28:48.284397 master-0 kubenswrapper[15202]: I0319 09:28:48.284312 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a"} err="failed to get container status \"b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a\": rpc error: code = NotFound desc = could not find container \"b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a\": container with ID starting with b1b80eb9a4eb68b0ff28c6087ac3d7e825472e98483df50ac50e7ce47c405f0a not found: ID does not exist"
Mar 19 09:28:48.284397 master-0 kubenswrapper[15202]: I0319 09:28:48.284362 15202 scope.go:117] "RemoveContainer" containerID="3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788"
Mar 19 09:28:48.285772 master-0 kubenswrapper[15202]: E0319 09:28:48.285187 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788\": container with ID starting with 3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788 not found: ID does not exist" containerID="3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788"
Mar 19 09:28:48.285772 master-0 kubenswrapper[15202]: I0319 09:28:48.285216 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788"} err="failed to get container status \"3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788\": rpc error: code = NotFound desc = could not find container \"3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788\": container with ID starting with 3100a8ef496d9b04d4a7adf9ec6041faf9cd3bbbf6cab7f6faf2adda976b9788 not found: ID does not exist"
Mar 19 09:28:48.285772 master-0 kubenswrapper[15202]: I0319 09:28:48.285234 15202 scope.go:117] "RemoveContainer" containerID="b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce"
Mar 19 09:28:48.285772 master-0 kubenswrapper[15202]: E0319 09:28:48.285704 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce\": container with ID starting with b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce not found: ID does not exist" containerID="b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce"
Mar 19 09:28:48.285772 master-0 kubenswrapper[15202]: I0319 09:28:48.285740 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce"} err="failed to get container status \"b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce\": rpc error: code = NotFound desc = could not find container \"b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce\": container with ID starting with
b74d28610f66e4fc97f2a752fa51c4d999efd52be13d38e6893b8037a584a6ce not found: ID does not exist" Mar 19 09:28:48.824298 master-0 kubenswrapper[15202]: I0319 09:28:48.824210 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1511182fa3564db9f50c25912cc22f" path="/var/lib/kubelet/pods/7a1511182fa3564db9f50c25912cc22f/volumes" Mar 19 09:28:50.602137 master-0 kubenswrapper[15202]: I0319 09:28:50.601965 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:28:50.603003 master-0 kubenswrapper[15202]: I0319 09:28:50.602177 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:28:51.152286 master-0 kubenswrapper[15202]: I0319 09:28:51.152198 15202 generic.go:334] "Generic (PLEG): container finished" podID="86b617be-3cae-4fea-bba7-199de3e4ecf6" containerID="355cd89078453b5d440100fc82d5481b6a468e22fd22088f1a704fff3b57820e" exitCode=0 Mar 19 09:28:51.152286 master-0 kubenswrapper[15202]: I0319 09:28:51.152277 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"86b617be-3cae-4fea-bba7-199de3e4ecf6","Type":"ContainerDied","Data":"355cd89078453b5d440100fc82d5481b6a468e22fd22088f1a704fff3b57820e"} Mar 19 09:28:51.153495 master-0 kubenswrapper[15202]: I0319 09:28:51.153428 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:51.154209 master-0 kubenswrapper[15202]: I0319 09:28:51.154108 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:51.155166 master-0 kubenswrapper[15202]: I0319 09:28:51.155100 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:51.656904 master-0 kubenswrapper[15202]: I0319 09:28:51.656843 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:28:51.657738 master-0 kubenswrapper[15202]: I0319 09:28:51.656923 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:28:52.358533 master-0 kubenswrapper[15202]: E0319 09:28:52.358401 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial 
tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.359687 master-0 kubenswrapper[15202]: E0319 09:28:52.359626 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.360601 master-0 kubenswrapper[15202]: E0319 09:28:52.360551 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.361698 master-0 kubenswrapper[15202]: E0319 09:28:52.361632 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.362523 master-0 kubenswrapper[15202]: E0319 09:28:52.362392 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.362523 master-0 kubenswrapper[15202]: I0319 09:28:52.362500 15202 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:28:52.363180 master-0 kubenswrapper[15202]: E0319 09:28:52.363133 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 09:28:52.565046 master-0 kubenswrapper[15202]: E0319 09:28:52.564943 15202 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 09:28:52.583022 master-0 kubenswrapper[15202]: I0319 09:28:52.582954 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 09:28:52.584145 master-0 kubenswrapper[15202]: I0319 09:28:52.584089 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.584688 master-0 kubenswrapper[15202]: I0319 09:28:52.584643 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.585127 master-0 kubenswrapper[15202]: I0319 09:28:52.585073 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.680074 master-0 kubenswrapper[15202]: I0319 09:28:52.679997 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-var-lock\") pod \"86b617be-3cae-4fea-bba7-199de3e4ecf6\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " Mar 19 09:28:52.680074 master-0 kubenswrapper[15202]: I0319 09:28:52.680080 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-kubelet-dir\") pod \"86b617be-3cae-4fea-bba7-199de3e4ecf6\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " Mar 19 09:28:52.680897 master-0 kubenswrapper[15202]: I0319 09:28:52.680140 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86b617be-3cae-4fea-bba7-199de3e4ecf6-kube-api-access\") pod \"86b617be-3cae-4fea-bba7-199de3e4ecf6\" (UID: \"86b617be-3cae-4fea-bba7-199de3e4ecf6\") " Mar 19 09:28:52.680897 master-0 kubenswrapper[15202]: I0319 09:28:52.680220 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-var-lock" (OuterVolumeSpecName: "var-lock") pod "86b617be-3cae-4fea-bba7-199de3e4ecf6" (UID: "86b617be-3cae-4fea-bba7-199de3e4ecf6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:52.680897 master-0 kubenswrapper[15202]: I0319 09:28:52.680299 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "86b617be-3cae-4fea-bba7-199de3e4ecf6" (UID: "86b617be-3cae-4fea-bba7-199de3e4ecf6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:52.680897 master-0 kubenswrapper[15202]: I0319 09:28:52.680589 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:52.680897 master-0 kubenswrapper[15202]: I0319 09:28:52.680612 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86b617be-3cae-4fea-bba7-199de3e4ecf6-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:52.685029 master-0 kubenswrapper[15202]: I0319 09:28:52.684979 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b617be-3cae-4fea-bba7-199de3e4ecf6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "86b617be-3cae-4fea-bba7-199de3e4ecf6" (UID: "86b617be-3cae-4fea-bba7-199de3e4ecf6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:28:52.776420 master-0 kubenswrapper[15202]: E0319 09:28:52.776127 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{prometheus-k8s-0.189e340b0c0a9c32 openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:prometheus-k8s-0,UID:c07fbef0-2fa8-4240-8b80-0c96f3ca53c7,APIVersion:v1,ResourceVersion:14295,FieldPath:spec.containers{kube-rbac-proxy-thanos},},Reason:Created,Message:Created container: kube-rbac-proxy-thanos,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:28:45.281860658 +0000 UTC m=+242.667275474,LastTimestamp:2026-03-19 09:28:45.281860658 +0000 UTC m=+242.667275474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:28:52.782510 master-0 kubenswrapper[15202]: I0319 09:28:52.782402 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86b617be-3cae-4fea-bba7-199de3e4ecf6-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:28:52.822218 master-0 kubenswrapper[15202]: I0319 09:28:52.821821 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.823755 master-0 kubenswrapper[15202]: I0319 09:28:52.823681 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.824628 master-0 kubenswrapper[15202]: I0319 09:28:52.824562 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:52.966759 master-0 kubenswrapper[15202]: E0319 09:28:52.966640 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 09:28:53.175525 master-0 kubenswrapper[15202]: I0319 09:28:53.175263 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"86b617be-3cae-4fea-bba7-199de3e4ecf6","Type":"ContainerDied","Data":"bf817d255fbe35689e89ade319171e62303ff41aeeae5c5e4b687a11f31215bd"} Mar 19 09:28:53.175525 master-0 kubenswrapper[15202]: I0319 09:28:53.175335 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf817d255fbe35689e89ade319171e62303ff41aeeae5c5e4b687a11f31215bd" Mar 19 09:28:53.175525 master-0 kubenswrapper[15202]: I0319 09:28:53.175372 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 09:28:53.181811 master-0 kubenswrapper[15202]: I0319 09:28:53.181695 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:53.182782 master-0 kubenswrapper[15202]: I0319 09:28:53.182703 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:53.183629 master-0 kubenswrapper[15202]: I0319 09:28:53.183557 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:53.768348 master-0 kubenswrapper[15202]: E0319 09:28:53.768207 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 09:28:55.370684 master-0 kubenswrapper[15202]: E0319 09:28:55.370579 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 
192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 09:28:56.207264 master-0 kubenswrapper[15202]: I0319 09:28:56.207180 15202 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="92a2db24929eebeb86c10e4da2210d08ce4c067d7696a9c259054e240344e6fa" exitCode=1 Mar 19 09:28:56.207264 master-0 kubenswrapper[15202]: I0319 09:28:56.207262 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"92a2db24929eebeb86c10e4da2210d08ce4c067d7696a9c259054e240344e6fa"} Mar 19 09:28:56.207597 master-0 kubenswrapper[15202]: I0319 09:28:56.207320 15202 scope.go:117] "RemoveContainer" containerID="6081e5f52de3fc4dc3f746460dde01bf5beff21d46d2be6b213ee24cc51a7282" Mar 19 09:28:56.208297 master-0 kubenswrapper[15202]: I0319 09:28:56.208065 15202 scope.go:117] "RemoveContainer" containerID="92a2db24929eebeb86c10e4da2210d08ce4c067d7696a9c259054e240344e6fa" Mar 19 09:28:56.208368 master-0 kubenswrapper[15202]: E0319 09:28:56.208346 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(c83737980b9ee109184b1d78e942cf36)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="c83737980b9ee109184b1d78e942cf36" Mar 19 09:28:56.208861 master-0 kubenswrapper[15202]: I0319 09:28:56.208783 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:56.210607 master-0 kubenswrapper[15202]: I0319 09:28:56.209697 15202 
status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:56.211176 master-0 kubenswrapper[15202]: I0319 09:28:56.211115 15202 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:56.213973 master-0 kubenswrapper[15202]: I0319 09:28:56.213909 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:58.573912 master-0 kubenswrapper[15202]: E0319 09:28:58.573758 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 19 09:28:58.811952 master-0 kubenswrapper[15202]: I0319 09:28:58.811879 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:28:58.814422 master-0 kubenswrapper[15202]: I0319 09:28:58.814307 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:58.815789 master-0 kubenswrapper[15202]: I0319 09:28:58.815623 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:58.816878 master-0 kubenswrapper[15202]: I0319 09:28:58.816788 15202 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:58.817841 master-0 kubenswrapper[15202]: I0319 09:28:58.817770 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:28:58.838455 master-0 kubenswrapper[15202]: I0319 09:28:58.838373 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:28:58.838455 
master-0 kubenswrapper[15202]: I0319 09:28:58.838442 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:28:58.839875 master-0 kubenswrapper[15202]: E0319 09:28:58.839808 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:28:58.840830 master-0 kubenswrapper[15202]: I0319 09:28:58.840734 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:28:58.871569 master-0 kubenswrapper[15202]: W0319 09:28:58.871455 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5ce05b3d592e63f1f92202d52b9635.slice/crio-4fbecf68f221ad2d3e7402d579f38e9cd8f3b24bb02fab19af146ec67f679b84 WatchSource:0}: Error finding container 4fbecf68f221ad2d3e7402d579f38e9cd8f3b24bb02fab19af146ec67f679b84: Status 404 returned error can't find the container with id 4fbecf68f221ad2d3e7402d579f38e9cd8f3b24bb02fab19af146ec67f679b84 Mar 19 09:28:59.246816 master-0 kubenswrapper[15202]: I0319 09:28:59.246672 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648"} Mar 19 09:28:59.246816 master-0 kubenswrapper[15202]: I0319 09:28:59.246765 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"4fbecf68f221ad2d3e7402d579f38e9cd8f3b24bb02fab19af146ec67f679b84"} Mar 19 09:29:00.258689 master-0 kubenswrapper[15202]: I0319 09:29:00.258533 15202 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" exitCode=1 Mar 19 09:29:00.258689 master-0 kubenswrapper[15202]: I0319 09:29:00.258599 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411"} Mar 19 09:29:00.259848 master-0 kubenswrapper[15202]: I0319 09:29:00.258724 15202 scope.go:117] "RemoveContainer" containerID="8c6bf6e4dc06dc33ce2a60a0abd7d0a106b6973ee1336f65f910e0cb73c9c346" Mar 19 09:29:00.260531 master-0 kubenswrapper[15202]: I0319 09:29:00.260417 15202 scope.go:117] "RemoveContainer" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" Mar 19 09:29:00.260742 master-0 kubenswrapper[15202]: I0319 09:29:00.260660 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.261041 master-0 kubenswrapper[15202]: E0319 09:29:00.260981 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
podUID="46f265536aba6292ead501bc9b49f327" Mar 19 09:29:00.261982 master-0 kubenswrapper[15202]: I0319 09:29:00.261884 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.264409 master-0 kubenswrapper[15202]: I0319 09:29:00.262530 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648" exitCode=0 Mar 19 09:29:00.264409 master-0 kubenswrapper[15202]: I0319 09:29:00.262555 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerDied","Data":"d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648"} Mar 19 09:29:00.264636 master-0 kubenswrapper[15202]: I0319 09:29:00.262960 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:00.264636 master-0 kubenswrapper[15202]: I0319 09:29:00.264465 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:00.265102 master-0 kubenswrapper[15202]: E0319 09:29:00.265036 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:00.265347 master-0 kubenswrapper[15202]: I0319 09:29:00.265287 15202 
status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.266159 master-0 kubenswrapper[15202]: I0319 09:29:00.266087 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.266965 master-0 kubenswrapper[15202]: I0319 09:29:00.266888 15202 status_manager.go:851] "Failed to get status for pod" podUID="46f265536aba6292ead501bc9b49f327" pod="kube-system/bootstrap-kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.267980 master-0 kubenswrapper[15202]: I0319 09:29:00.267907 15202 status_manager.go:851] "Failed to get status for pod" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.268915 master-0 kubenswrapper[15202]: I0319 09:29:00.268831 15202 status_manager.go:851] "Failed to get status for pod" podUID="46f265536aba6292ead501bc9b49f327" pod="kube-system/bootstrap-kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: 
connection refused" Mar 19 09:29:00.269839 master-0 kubenswrapper[15202]: I0319 09:29:00.269771 15202 status_manager.go:851] "Failed to get status for pod" podUID="c07fbef0-2fa8-4240-8b80-0c96f3ca53c7" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.270533 master-0 kubenswrapper[15202]: I0319 09:29:00.270457 15202 status_manager.go:851] "Failed to get status for pod" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.271002 master-0 kubenswrapper[15202]: I0319 09:29:00.270949 15202 status_manager.go:851] "Failed to get status for pod" podUID="c83737980b9ee109184b1d78e942cf36" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:29:00.416917 master-0 kubenswrapper[15202]: I0319 09:29:00.416840 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:29:00.444623 master-0 kubenswrapper[15202]: I0319 09:29:00.444554 15202 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:29:00.603045 master-0 kubenswrapper[15202]: I0319 09:29:00.602951 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: 
connect: connection refused" start-of-body= Mar 19 09:29:00.603356 master-0 kubenswrapper[15202]: I0319 09:29:00.603047 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:29:01.281385 master-0 kubenswrapper[15202]: I0319 09:29:01.281277 15202 scope.go:117] "RemoveContainer" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" Mar 19 09:29:01.282630 master-0 kubenswrapper[15202]: E0319 09:29:01.282089 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" Mar 19 09:29:01.286522 master-0 kubenswrapper[15202]: I0319 09:29:01.286397 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494"} Mar 19 09:29:01.286522 master-0 kubenswrapper[15202]: I0319 09:29:01.286525 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120"} Mar 19 09:29:01.656136 master-0 kubenswrapper[15202]: I0319 09:29:01.656066 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:29:01.656414 master-0 kubenswrapper[15202]: I0319 09:29:01.656161 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:29:02.274318 master-0 kubenswrapper[15202]: I0319 09:29:02.274226 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:29:02.301694 master-0 kubenswrapper[15202]: I0319 09:29:02.301584 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1"} Mar 19 09:29:02.302543 master-0 kubenswrapper[15202]: I0319 09:29:02.302247 15202 scope.go:117] "RemoveContainer" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" Mar 19 09:29:02.306492 master-0 kubenswrapper[15202]: E0319 09:29:02.302585 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" Mar 19 09:29:03.313421 master-0 kubenswrapper[15202]: I0319 09:29:03.313321 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"63d59fc46d669df5a337070a35f5b53eb2b46e5fc1749cd3250ef710c1c9446e"} Mar 19 09:29:03.313421 master-0 kubenswrapper[15202]: I0319 09:29:03.313403 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381"} Mar 19 09:29:03.314346 master-0 kubenswrapper[15202]: I0319 09:29:03.313743 15202 scope.go:117] "RemoveContainer" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" Mar 19 09:29:03.314346 master-0 kubenswrapper[15202]: I0319 09:29:03.313902 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:03.314346 master-0 kubenswrapper[15202]: I0319 09:29:03.313959 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:03.314346 master-0 kubenswrapper[15202]: E0319 09:29:03.314030 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(46f265536aba6292ead501bc9b49f327)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" Mar 19 09:29:03.842021 master-0 kubenswrapper[15202]: I0319 09:29:03.841901 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:03.842021 master-0 kubenswrapper[15202]: I0319 09:29:03.841980 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: I0319 09:29:03.847396 15202 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]log ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]etcd ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-filter ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiextensions-informers ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-apiextensions-controllers ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/crd-informer-synced ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: 
[+]poststarthook/start-system-namespaces-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/bootstrap-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/start-kube-aggregator-informers ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-registration-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-discovery-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 19 09:29:03.847502 
master-0 kubenswrapper[15202]: [+]autoregister-completion ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-openapi-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: livez check failed Mar 19 09:29:03.847502 master-0 kubenswrapper[15202]: I0319 09:29:03.847496 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:29:08.842235 master-0 kubenswrapper[15202]: I0319 09:29:08.842128 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:09.611728 master-0 kubenswrapper[15202]: I0319 09:29:09.611637 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:10.191090 master-0 kubenswrapper[15202]: I0319 09:29:10.191044 15202 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:10.379826 master-0 kubenswrapper[15202]: I0319 09:29:10.379546 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-check-endpoints/0.log" Mar 19 09:29:10.382647 master-0 kubenswrapper[15202]: I0319 09:29:10.382601 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="63d59fc46d669df5a337070a35f5b53eb2b46e5fc1749cd3250ef710c1c9446e" exitCode=255 Mar 19 09:29:10.382719 master-0 kubenswrapper[15202]: I0319 09:29:10.382677 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerDied","Data":"63d59fc46d669df5a337070a35f5b53eb2b46e5fc1749cd3250ef710c1c9446e"} Mar 19 09:29:10.602689 master-0 kubenswrapper[15202]: I0319 09:29:10.602441 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:29:10.602689 master-0 kubenswrapper[15202]: I0319 09:29:10.602604 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:29:10.812998 master-0 kubenswrapper[15202]: I0319 09:29:10.812908 15202 scope.go:117] "RemoveContainer" containerID="92a2db24929eebeb86c10e4da2210d08ce4c067d7696a9c259054e240344e6fa" Mar 19 09:29:10.992439 master-0 kubenswrapper[15202]: I0319 09:29:10.992374 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="516bf51f-c7e4-4837-9994-41e603754099" Mar 19 09:29:11.397310 master-0 kubenswrapper[15202]: I0319 09:29:11.397046 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"5dc0d234726aa7e09b7e006830783386358cf8a6aeab0626a8eb7cfa7413be9f"} Mar 19 09:29:11.397996 master-0 kubenswrapper[15202]: I0319 09:29:11.397530 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:11.397996 master-0 kubenswrapper[15202]: I0319 09:29:11.397569 
15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:11.405912 master-0 kubenswrapper[15202]: I0319 09:29:11.405860 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:11.434589 master-0 kubenswrapper[15202]: I0319 09:29:11.434118 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="516bf51f-c7e4-4837-9994-41e603754099" Mar 19 09:29:11.437171 master-0 kubenswrapper[15202]: I0319 09:29:11.436608 15202 scope.go:117] "RemoveContainer" containerID="63d59fc46d669df5a337070a35f5b53eb2b46e5fc1749cd3250ef710c1c9446e" Mar 19 09:29:11.656974 master-0 kubenswrapper[15202]: I0319 09:29:11.656781 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Mar 19 09:29:11.656974 master-0 kubenswrapper[15202]: I0319 09:29:11.656897 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Mar 19 09:29:12.407668 master-0 kubenswrapper[15202]: I0319 09:29:12.407610 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-check-endpoints/0.log" Mar 19 09:29:12.411371 master-0 kubenswrapper[15202]: I0319 09:29:12.411333 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f"} Mar 19 09:29:12.411836 master-0 kubenswrapper[15202]: I0319 09:29:12.411813 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:29:12.412080 master-0 kubenswrapper[15202]: I0319 09:29:12.412011 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:12.412080 master-0 kubenswrapper[15202]: I0319 09:29:12.412081 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:12.417625 master-0 kubenswrapper[15202]: I0319 09:29:12.417553 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="516bf51f-c7e4-4837-9994-41e603754099" Mar 19 09:29:13.421052 master-0 kubenswrapper[15202]: I0319 09:29:13.420954 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:13.421052 master-0 kubenswrapper[15202]: I0319 09:29:13.421012 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="89d5225e-3c24-4e60-9de9-b2b3714dfe15" Mar 19 09:29:13.425082 master-0 kubenswrapper[15202]: I0319 09:29:13.425032 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="516bf51f-c7e4-4837-9994-41e603754099" Mar 19 09:29:14.811705 master-0 kubenswrapper[15202]: I0319 
09:29:14.811589 15202 scope.go:117] "RemoveContainer" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" Mar 19 09:29:15.443496 master-0 kubenswrapper[15202]: I0319 09:29:15.443398 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"c07894aa55def3d2147701356df1f2900a277d1378259aefae49e515291dc919"} Mar 19 09:29:17.268513 master-0 kubenswrapper[15202]: I0319 09:29:17.268415 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 09:29:17.499109 master-0 kubenswrapper[15202]: I0319 09:29:17.498998 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:29:17.754900 master-0 kubenswrapper[15202]: I0319 09:29:17.754796 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:29:17.769232 master-0 kubenswrapper[15202]: I0319 09:29:17.769149 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:29:17.863735 master-0 kubenswrapper[15202]: I0319 09:29:17.863675 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:29:18.033556 master-0 kubenswrapper[15202]: I0319 09:29:18.033332 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 09:29:18.063393 master-0 kubenswrapper[15202]: I0319 09:29:18.063337 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 09:29:18.127283 master-0 kubenswrapper[15202]: I0319 09:29:18.127190 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:29:18.204364 master-0 kubenswrapper[15202]: I0319 09:29:18.203620 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:29:18.225400 master-0 kubenswrapper[15202]: I0319 09:29:18.225349 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:29:18.244810 master-0 kubenswrapper[15202]: I0319 09:29:18.244743 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:29:18.290998 master-0 kubenswrapper[15202]: I0319 09:29:18.290836 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:29:18.331551 master-0 kubenswrapper[15202]: I0319 09:29:18.331437 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 09:29:18.389991 master-0 kubenswrapper[15202]: I0319 09:29:18.389924 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 09:29:18.675384 master-0 kubenswrapper[15202]: I0319 09:29:18.675210 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 09:29:18.818568 master-0 kubenswrapper[15202]: I0319 09:29:18.818519 15202 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:29:19.080974 master-0 kubenswrapper[15202]: I0319 09:29:19.080927 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:29:19.120991 master-0 kubenswrapper[15202]: I0319 09:29:19.120940 15202 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:29:19.192795 master-0 kubenswrapper[15202]: I0319 09:29:19.192731 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:29:19.280818 master-0 kubenswrapper[15202]: I0319 09:29:19.280730 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-57xnh" Mar 19 09:29:19.592577 master-0 kubenswrapper[15202]: I0319 09:29:19.592387 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 09:29:19.874822 master-0 kubenswrapper[15202]: I0319 09:29:19.873784 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:29:19.918208 master-0 kubenswrapper[15202]: I0319 09:29:19.918124 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 09:29:20.054384 master-0 kubenswrapper[15202]: I0319 09:29:20.054304 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 09:29:20.315800 master-0 kubenswrapper[15202]: I0319 09:29:20.315738 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:29:20.368682 master-0 kubenswrapper[15202]: I0319 09:29:20.368605 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:29:20.388324 master-0 kubenswrapper[15202]: I0319 09:29:20.388252 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 09:29:20.401743 master-0 kubenswrapper[15202]: I0319 09:29:20.401644 15202 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:29:20.416828 master-0 kubenswrapper[15202]: I0319 09:29:20.416735 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:29:20.432957 master-0 kubenswrapper[15202]: I0319 09:29:20.432890 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 09:29:20.439675 master-0 kubenswrapper[15202]: I0319 09:29:20.439608 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:29:20.478614 master-0 kubenswrapper[15202]: I0319 09:29:20.478536 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:29:20.602926 master-0 kubenswrapper[15202]: I0319 09:29:20.602082 15202 patch_prober.go:28] interesting pod/console-697d79fb97-jrvk4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" start-of-body= Mar 19 09:29:20.602926 master-0 kubenswrapper[15202]: I0319 09:29:20.602193 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-697d79fb97-jrvk4" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" probeResult="failure" output="Get \"https://10.128.0.95:8443/health\": dial tcp 10.128.0.95:8443: connect: connection refused" Mar 19 09:29:20.603608 master-0 kubenswrapper[15202]: I0319 09:29:20.603511 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 09:29:20.626445 master-0 kubenswrapper[15202]: I0319 09:29:20.626374 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zxmm6" Mar 19 09:29:20.633353 
master-0 kubenswrapper[15202]: I0319 09:29:20.633261 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-t6dfg" Mar 19 09:29:20.684236 master-0 kubenswrapper[15202]: I0319 09:29:20.684172 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:29:20.903010 master-0 kubenswrapper[15202]: I0319 09:29:20.902826 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 09:29:20.909399 master-0 kubenswrapper[15202]: I0319 09:29:20.909351 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 09:29:21.010623 master-0 kubenswrapper[15202]: I0319 09:29:21.010542 15202 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:29:21.011016 master-0 kubenswrapper[15202]: I0319 09:29:21.011003 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:29:21.096753 master-0 kubenswrapper[15202]: I0319 09:29:21.096689 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 09:29:21.185964 master-0 kubenswrapper[15202]: I0319 09:29:21.185899 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 09:29:21.213223 master-0 kubenswrapper[15202]: I0319 09:29:21.213156 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 09:29:21.216153 master-0 kubenswrapper[15202]: I0319 09:29:21.216100 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 
09:29:21.320727 master-0 kubenswrapper[15202]: I0319 09:29:21.320658 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 09:29:21.361636 master-0 kubenswrapper[15202]: I0319 09:29:21.361535 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:29:21.412590 master-0 kubenswrapper[15202]: I0319 09:29:21.411109 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 09:29:21.415312 master-0 kubenswrapper[15202]: I0319 09:29:21.414158 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:29:21.467081 master-0 kubenswrapper[15202]: I0319 09:29:21.466967 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 09:29:21.475746 master-0 kubenswrapper[15202]: I0319 09:29:21.475486 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 09:29:21.478857 master-0 kubenswrapper[15202]: I0319 09:29:21.478810 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 09:29:21.494722 master-0 kubenswrapper[15202]: I0319 09:29:21.494545 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 09:29:21.551645 master-0 kubenswrapper[15202]: I0319 09:29:21.551575 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 09:29:21.570036 master-0 kubenswrapper[15202]: I0319 09:29:21.569535 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:29:21.570856 master-0 kubenswrapper[15202]: I0319 09:29:21.570756 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 09:29:21.572149 master-0 kubenswrapper[15202]: I0319 09:29:21.572102 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 09:29:21.574238 master-0 kubenswrapper[15202]: I0319 09:29:21.574155 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 09:29:21.580780 master-0 kubenswrapper[15202]: I0319 09:29:21.580061 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 09:29:21.649872 master-0 kubenswrapper[15202]: I0319 09:29:21.649812 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:29:21.656445 master-0 kubenswrapper[15202]: I0319 09:29:21.656394 15202 patch_prober.go:28] interesting pod/console-cdc9755cd-fl679 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body=
Mar 19 09:29:21.656562 master-0 kubenswrapper[15202]: I0319 09:29:21.656460 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused"
Mar 19 09:29:21.696558 master-0 kubenswrapper[15202]: I0319 09:29:21.696454 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 19 09:29:21.708456 master-0 kubenswrapper[15202]: I0319 09:29:21.708391 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 09:29:21.725674 master-0 kubenswrapper[15202]: I0319 09:29:21.725539 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 09:29:21.769204 master-0 kubenswrapper[15202]: I0319 09:29:21.769122 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-tgqwm"
Mar 19 09:29:21.819607 master-0 kubenswrapper[15202]: I0319 09:29:21.817917 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 09:29:21.958835 master-0 kubenswrapper[15202]: I0319 09:29:21.958752 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:29:21.975196 master-0 kubenswrapper[15202]: I0319 09:29:21.975145 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 09:29:21.999045 master-0 kubenswrapper[15202]: I0319 09:29:21.998894 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:29:22.146180 master-0 kubenswrapper[15202]: I0319 09:29:22.146044 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:29:22.152315 master-0 kubenswrapper[15202]: I0319 09:29:22.152241 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 09:29:22.179954 master-0 kubenswrapper[15202]: I0319 09:29:22.179779 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:29:22.196595 master-0 kubenswrapper[15202]: I0319 09:29:22.193140 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:29:22.230342 master-0 kubenswrapper[15202]: I0319 09:29:22.230261 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 09:29:22.275370 master-0 kubenswrapper[15202]: I0319 09:29:22.275172 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:29:22.279411 master-0 kubenswrapper[15202]: I0319 09:29:22.279320 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 09:29:22.373426 master-0 kubenswrapper[15202]: I0319 09:29:22.373355 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 09:29:22.387869 master-0 kubenswrapper[15202]: I0319 09:29:22.387753 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:29:22.420668 master-0 kubenswrapper[15202]: I0319 09:29:22.420585 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 09:29:22.459184 master-0 kubenswrapper[15202]: I0319 09:29:22.459112 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:29:22.531534 master-0 kubenswrapper[15202]: I0319 09:29:22.531352 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 09:29:22.582361 master-0 kubenswrapper[15202]: I0319 09:29:22.582284 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 09:29:22.596995 master-0 kubenswrapper[15202]: I0319 09:29:22.596922 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 09:29:22.635436 master-0 kubenswrapper[15202]: I0319 09:29:22.635351 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 09:29:22.698309 master-0 kubenswrapper[15202]: I0319 09:29:22.698234 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 09:29:22.722670 master-0 kubenswrapper[15202]: I0319 09:29:22.722586 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 09:29:22.741549 master-0 kubenswrapper[15202]: I0319 09:29:22.741442 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 09:29:22.776222 master-0 kubenswrapper[15202]: I0319 09:29:22.776146 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-mdr74"
Mar 19 09:29:22.786404 master-0 kubenswrapper[15202]: I0319 09:29:22.786257 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-r6z7f"
Mar 19 09:29:22.827439 master-0 kubenswrapper[15202]: I0319 09:29:22.827386 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 09:29:22.904920 master-0 kubenswrapper[15202]: I0319 09:29:22.904849 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:29:23.138836 master-0 kubenswrapper[15202]: I0319 09:29:23.138713 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 19 09:29:23.176525 master-0 kubenswrapper[15202]: I0319 09:29:23.176449 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:29:23.248134 master-0 kubenswrapper[15202]: I0319 09:29:23.248070 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 09:29:23.323898 master-0 kubenswrapper[15202]: I0319 09:29:23.323837 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:29:23.350835 master-0 kubenswrapper[15202]: I0319 09:29:23.350770 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 09:29:23.372104 master-0 kubenswrapper[15202]: I0319 09:29:23.372041 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 09:29:23.384992 master-0 kubenswrapper[15202]: I0319 09:29:23.384933 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 09:29:23.415928 master-0 kubenswrapper[15202]: I0319 09:29:23.415802 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 09:29:23.439999 master-0 kubenswrapper[15202]: I0319 09:29:23.439922 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 19 09:29:23.610364 master-0 kubenswrapper[15202]: I0319 09:29:23.610275 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 19 09:29:23.645226 master-0 kubenswrapper[15202]: I0319 09:29:23.645150 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 19 09:29:23.760002 master-0 kubenswrapper[15202]: I0319 09:29:23.759937 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 09:29:23.787067 master-0 kubenswrapper[15202]: I0319 09:29:23.786979 15202 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 09:29:23.828536 master-0 kubenswrapper[15202]: I0319 09:29:23.825958 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 09:29:23.849260 master-0 kubenswrapper[15202]: I0319 09:29:23.849204 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 09:29:23.916610 master-0 kubenswrapper[15202]: I0319 09:29:23.915890 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 09:29:23.948515 master-0 kubenswrapper[15202]: I0319 09:29:23.947987 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 09:29:23.948515 master-0 kubenswrapper[15202]: I0319 09:29:23.948245 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 09:29:23.968780 master-0 kubenswrapper[15202]: I0319 09:29:23.960726 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 09:29:24.032965 master-0 kubenswrapper[15202]: I0319 09:29:24.032816 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 09:29:24.080255 master-0 kubenswrapper[15202]: I0319 09:29:24.080204 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 19 09:29:24.141842 master-0 kubenswrapper[15202]: I0319 09:29:24.141781 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 19 09:29:24.184570 master-0 kubenswrapper[15202]: I0319 09:29:24.184512 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xcbjl"
Mar 19 09:29:24.209833 master-0 kubenswrapper[15202]: I0319 09:29:24.209780 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 09:29:24.283557 master-0 kubenswrapper[15202]: I0319 09:29:24.283358 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 09:29:24.298622 master-0 kubenswrapper[15202]: I0319 09:29:24.298553 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:29:24.312187 master-0 kubenswrapper[15202]: I0319 09:29:24.312135 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 09:29:24.409151 master-0 kubenswrapper[15202]: I0319 09:29:24.408830 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4i3vpe46p0rrq"
Mar 19 09:29:24.426826 master-0 kubenswrapper[15202]: I0319 09:29:24.426593 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-lvzxr"
Mar 19 09:29:24.442584 master-0 kubenswrapper[15202]: I0319 09:29:24.442367 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 09:29:24.475688 master-0 kubenswrapper[15202]: I0319 09:29:24.475418 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wkbj2"
Mar 19 09:29:24.498952 master-0 kubenswrapper[15202]: I0319 09:29:24.498702 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 09:29:24.514023 master-0 kubenswrapper[15202]: I0319 09:29:24.513965 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 09:29:24.535821 master-0 kubenswrapper[15202]: I0319 09:29:24.535681 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 19 09:29:24.550284 master-0 kubenswrapper[15202]: I0319 09:29:24.550240 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 09:29:24.554059 master-0 kubenswrapper[15202]: I0319 09:29:24.554033 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 09:29:24.642868 master-0 kubenswrapper[15202]: I0319 09:29:24.642819 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 09:29:24.700663 master-0 kubenswrapper[15202]: I0319 09:29:24.700582 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5339u6k6jn3h3"
Mar 19 09:29:24.709181 master-0 kubenswrapper[15202]: I0319 09:29:24.709128 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:29:24.713502 master-0 kubenswrapper[15202]: I0319 09:29:24.713424 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:29:24.752402 master-0 kubenswrapper[15202]: I0319 09:29:24.752352 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 09:29:24.769462 master-0 kubenswrapper[15202]: I0319 09:29:24.769415 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 09:29:24.791740 master-0 kubenswrapper[15202]: I0319 09:29:24.791581 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 09:29:24.868502 master-0 kubenswrapper[15202]: I0319 09:29:24.867327 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 09:29:25.011738 master-0 kubenswrapper[15202]: I0319 09:29:25.011679 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 09:29:25.014696 master-0 kubenswrapper[15202]: I0319 09:29:25.014663 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 09:29:25.024956 master-0 kubenswrapper[15202]: I0319 09:29:25.024912 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:29:25.163064 master-0 kubenswrapper[15202]: I0319 09:29:25.162939 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 09:29:25.308193 master-0 kubenswrapper[15202]: I0319 09:29:25.308119 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 09:29:25.342659 master-0 kubenswrapper[15202]: I0319 09:29:25.341030 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:29:25.342659 master-0 kubenswrapper[15202]: I0319 09:29:25.341241 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 09:29:25.365850 master-0 kubenswrapper[15202]: I0319 09:29:25.365780 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-j66zv"
Mar 19 09:29:25.381862 master-0 kubenswrapper[15202]: I0319 09:29:25.381802 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 09:29:25.510957 master-0 kubenswrapper[15202]: I0319 09:29:25.510894 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:29:25.521592 master-0 kubenswrapper[15202]: I0319 09:29:25.521542 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 19 09:29:25.645192 master-0 kubenswrapper[15202]: I0319 09:29:25.645136 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:29:25.657797 master-0 kubenswrapper[15202]: I0319 09:29:25.657730 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 09:29:25.694516 master-0 kubenswrapper[15202]: I0319 09:29:25.694137 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 09:29:25.773072 master-0 kubenswrapper[15202]: I0319 09:29:25.772944 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 09:29:25.805283 master-0 kubenswrapper[15202]: I0319 09:29:25.805224 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 09:29:25.830934 master-0 kubenswrapper[15202]: I0319 09:29:25.830861 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 19 09:29:25.874212 master-0 kubenswrapper[15202]: I0319 09:29:25.874139 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 09:29:25.925334 master-0 kubenswrapper[15202]: I0319 09:29:25.925270 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 19 09:29:26.025875 master-0 kubenswrapper[15202]: I0319 09:29:26.025719 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 09:29:26.057538 master-0 kubenswrapper[15202]: I0319 09:29:26.057344 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:29:26.082027 master-0 kubenswrapper[15202]: I0319 09:29:26.081965 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 09:29:26.174933 master-0 kubenswrapper[15202]: I0319 09:29:26.174860 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 09:29:26.215492 master-0 kubenswrapper[15202]: I0319 09:29:26.215428 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 09:29:26.220258 master-0 kubenswrapper[15202]: I0319 09:29:26.220207 15202 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 09:29:26.221858 master-0 kubenswrapper[15202]: I0319 09:29:26.221781 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=41.221762618 podStartE2EDuration="41.221762618s" podCreationTimestamp="2026-03-19 09:28:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:29:10.524858485 +0000 UTC m=+267.910273341" watchObservedRunningTime="2026-03-19 09:29:26.221762618 +0000 UTC m=+283.607177434"
Mar 19 09:29:26.225573 master-0 kubenswrapper[15202]: I0319 09:29:26.225516 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=44.225504908 podStartE2EDuration="44.225504908s" podCreationTimestamp="2026-03-19 09:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:29:10.178801995 +0000 UTC m=+267.564216871" watchObservedRunningTime="2026-03-19 09:29:26.225504908 +0000 UTC m=+283.610919724"
Mar 19 09:29:26.226413 master-0 kubenswrapper[15202]: I0319 09:29:26.226382 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:29:26.226459 master-0 kubenswrapper[15202]: I0319 09:29:26.226421 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:29:26.231774 master-0 kubenswrapper[15202]: I0319 09:29:26.231590 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:29:26.258025 master-0 kubenswrapper[15202]: I0319 09:29:26.257891 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=16.25777032 podStartE2EDuration="16.25777032s" podCreationTimestamp="2026-03-19 09:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:29:26.251232432 +0000 UTC m=+283.636647268" watchObservedRunningTime="2026-03-19 09:29:26.25777032 +0000 UTC m=+283.643185136"
Mar 19 09:29:26.342058 master-0 kubenswrapper[15202]: I0319 09:29:26.341920 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 09:29:26.358291 master-0 kubenswrapper[15202]: I0319 09:29:26.358242 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 09:29:26.369646 master-0 kubenswrapper[15202]: I0319 09:29:26.369578 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 19 09:29:26.377507 master-0 kubenswrapper[15202]: I0319 09:29:26.377426 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 09:29:26.380991 master-0 kubenswrapper[15202]: I0319 09:29:26.380937 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 09:29:26.392995 master-0 kubenswrapper[15202]: I0319 09:29:26.392937 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 09:29:26.394001 master-0 kubenswrapper[15202]: I0319 09:29:26.393955 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 09:29:26.454454 master-0 kubenswrapper[15202]: I0319 09:29:26.454335 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 19 09:29:26.498492 master-0 kubenswrapper[15202]: I0319 09:29:26.498412 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:29:26.537017 master-0 kubenswrapper[15202]: I0319 09:29:26.536929 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:29:26.586365 master-0 kubenswrapper[15202]: I0319 09:29:26.586313 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:29:26.623678 master-0 kubenswrapper[15202]: I0319 09:29:26.623562 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 09:29:26.826648 master-0 kubenswrapper[15202]: I0319 09:29:26.826578 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:29:26.827879 master-0 kubenswrapper[15202]: I0319 09:29:26.827836 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 09:29:26.837270 master-0 kubenswrapper[15202]: I0319 09:29:26.837215 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-l9t78"
Mar 19 09:29:26.843510 master-0 kubenswrapper[15202]: I0319 09:29:26.843451 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:29:26.845871 master-0 kubenswrapper[15202]: I0319 09:29:26.845846 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 09:29:26.856074 master-0 kubenswrapper[15202]: I0319 09:29:26.855604 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 09:29:26.858006 master-0 kubenswrapper[15202]: I0319 09:29:26.857977 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:29:26.948495 master-0 kubenswrapper[15202]: I0319 09:29:26.948417 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 09:29:26.951625 master-0 kubenswrapper[15202]: I0319 09:29:26.951582 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:29:26.965417 master-0 kubenswrapper[15202]: I0319 09:29:26.965370 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:29:26.989028 master-0 kubenswrapper[15202]: I0319 09:29:26.988977 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 09:29:27.023422 master-0 kubenswrapper[15202]: I0319 09:29:27.023376 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:29:27.064899 master-0 kubenswrapper[15202]: I0319 09:29:27.064859 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 09:29:27.079184 master-0 kubenswrapper[15202]: I0319 09:29:27.079126 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:29:27.094606 master-0 kubenswrapper[15202]: I0319 09:29:27.094554 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 09:29:27.101933 master-0 kubenswrapper[15202]: I0319 09:29:27.101889 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 09:29:27.109456 master-0 kubenswrapper[15202]: I0319 09:29:27.109421 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 09:29:27.129763 master-0 kubenswrapper[15202]: I0319 09:29:27.129729 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 19 09:29:27.149967 master-0 kubenswrapper[15202]: I0319 09:29:27.149932 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 09:29:27.289792 master-0 kubenswrapper[15202]: I0319 09:29:27.289674 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 19 09:29:27.451594 master-0 kubenswrapper[15202]: I0319 09:29:27.451556 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:29:27.490342 master-0 kubenswrapper[15202]: I0319 09:29:27.490291 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 09:29:27.549797 master-0 kubenswrapper[15202]: I0319 09:29:27.549626 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 09:29:27.569388 master-0 kubenswrapper[15202]: I0319 09:29:27.569313 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 09:29:27.594764 master-0 kubenswrapper[15202]: I0319 09:29:27.594703 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 09:29:27.596683 master-0 kubenswrapper[15202]: I0319 09:29:27.596645 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 09:29:27.610590 master-0 kubenswrapper[15202]: I0319 09:29:27.610514 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 09:29:27.625818 master-0 kubenswrapper[15202]: I0319 09:29:27.625760 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 09:29:27.675220 master-0 kubenswrapper[15202]: I0319 09:29:27.675122 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:29:27.677058 master-0 kubenswrapper[15202]: I0319 09:29:27.677020 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 09:29:27.739600 master-0 kubenswrapper[15202]: I0319 09:29:27.739533 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 09:29:27.851845 master-0 kubenswrapper[15202]: I0319 09:29:27.851715 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 09:29:27.926228 master-0 kubenswrapper[15202]: I0319 09:29:27.926135 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:29:27.935606 master-0 kubenswrapper[15202]: I0319 09:29:27.935545 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 19 09:29:27.937180 master-0 kubenswrapper[15202]: I0319 09:29:27.937150 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 09:29:27.986260 master-0 kubenswrapper[15202]: I0319 09:29:27.986190 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:29:27.996279 master-0 kubenswrapper[15202]: I0319 09:29:27.996219 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-jtdpn"
Mar 19 09:29:28.069730 master-0 kubenswrapper[15202]: I0319 09:29:28.069680 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 09:29:28.205831 master-0 kubenswrapper[15202]: I0319 09:29:28.205752 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jvr7z"
Mar 19 09:29:28.218034 master-0 kubenswrapper[15202]: I0319 09:29:28.217438 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 09:29:28.229775 master-0 kubenswrapper[15202]: I0319 09:29:28.229724 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 09:29:28.284766 master-0 kubenswrapper[15202]: I0319 09:29:28.284702 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 09:29:28.323363 master-0 kubenswrapper[15202]: I0319 09:29:28.323287 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 09:29:28.327974 master-0 kubenswrapper[15202]: I0319 09:29:28.327926 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 09:29:28.384705 master-0 kubenswrapper[15202]: I0319 09:29:28.384644 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 09:29:28.408490 master-0 kubenswrapper[15202]: I0319 09:29:28.408413 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 09:29:28.473607 master-0 kubenswrapper[15202]: I0319 09:29:28.473461 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 09:29:28.538361 master-0 kubenswrapper[15202]: I0319 09:29:28.538271 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 19 09:29:28.576346 master-0 kubenswrapper[15202]: I0319 09:29:28.576270 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:29:28.581587 master-0 kubenswrapper[15202]: I0319 09:29:28.581554 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:29:28.601318 master-0 kubenswrapper[15202]: I0319 09:29:28.601267 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 19 09:29:28.637814 master-0 kubenswrapper[15202]: I0319 09:29:28.637723 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 19 09:29:28.688991 master-0 kubenswrapper[15202]: I0319 09:29:28.688931 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:29:28.718858 master-0 kubenswrapper[15202]: I0319 09:29:28.718794 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 09:29:28.733792 master-0 kubenswrapper[15202]: I0319 09:29:28.733679 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 09:29:28.756875 master-0 kubenswrapper[15202]: I0319 09:29:28.756832 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 09:29:28.788221 master-0 kubenswrapper[15202]: I0319 09:29:28.788169 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:29:28.799680 master-0 kubenswrapper[15202]: I0319 09:29:28.799605 15202 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 09:29:28.816602 master-0 kubenswrapper[15202]: I0319 09:29:28.816521 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 09:29:28.849300 master-0 kubenswrapper[15202]: I0319 09:29:28.848294 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 09:29:28.860289 master-0 kubenswrapper[15202]: I0319 09:29:28.858329 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-zsf7l"
Mar 19 09:29:28.877166 master-0 kubenswrapper[15202]: I0319 09:29:28.876539 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 09:29:28.879814 master-0 kubenswrapper[15202]: I0319 09:29:28.879695 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-17lh7pj6890g7"
Mar 19 09:29:28.925033 master-0 kubenswrapper[15202]: I0319 09:29:28.924840 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 19 09:29:28.948352 master-0 kubenswrapper[15202]: I0319 09:29:28.948109 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 09:29:28.990249 master-0 kubenswrapper[15202]: I0319 09:29:28.990135 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 09:29:28.999215 master-0 kubenswrapper[15202]: I0319
09:29:28.999184 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:29:29.032397 master-0 kubenswrapper[15202]: I0319 09:29:29.032327 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 19 09:29:29.032698 master-0 kubenswrapper[15202]: E0319 09:29:29.032677 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" containerName="installer" Mar 19 09:29:29.032698 master-0 kubenswrapper[15202]: I0319 09:29:29.032698 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" containerName="installer" Mar 19 09:29:29.032843 master-0 kubenswrapper[15202]: I0319 09:29:29.032828 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b617be-3cae-4fea-bba7-199de3e4ecf6" containerName="installer" Mar 19 09:29:29.033350 master-0 kubenswrapper[15202]: I0319 09:29:29.033333 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.035915 master-0 kubenswrapper[15202]: I0319 09:29:29.035885 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 09:29:29.036417 master-0 kubenswrapper[15202]: I0319 09:29:29.036389 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-jh786" Mar 19 09:29:29.037142 master-0 kubenswrapper[15202]: I0319 09:29:29.037117 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6s584" Mar 19 09:29:29.048832 master-0 kubenswrapper[15202]: I0319 09:29:29.048526 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 09:29:29.054576 master-0 kubenswrapper[15202]: I0319 09:29:29.053909 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 19 09:29:29.159129 master-0 kubenswrapper[15202]: I0319 09:29:29.159073 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:29:29.174858 master-0 kubenswrapper[15202]: I0319 09:29:29.174785 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 09:29:29.186030 master-0 kubenswrapper[15202]: I0319 09:29:29.185961 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.186126 master-0 kubenswrapper[15202]: I0319 
09:29:29.186108 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.186193 master-0 kubenswrapper[15202]: I0319 09:29:29.186172 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.218780 master-0 kubenswrapper[15202]: I0319 09:29:29.218707 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 09:29:29.288001 master-0 kubenswrapper[15202]: I0319 09:29:29.287864 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.288001 master-0 kubenswrapper[15202]: I0319 09:29:29.287946 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.288001 master-0 kubenswrapper[15202]: I0319 09:29:29.287993 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.288307 master-0 kubenswrapper[15202]: I0319 09:29:29.288013 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.288307 master-0 kubenswrapper[15202]: I0319 09:29:29.288094 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.305725 master-0 kubenswrapper[15202]: I0319 09:29:29.305566 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.354259 master-0 kubenswrapper[15202]: I0319 09:29:29.354186 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 19 09:29:29.382678 master-0 kubenswrapper[15202]: I0319 09:29:29.382580 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:29:29.430861 master-0 kubenswrapper[15202]: I0319 09:29:29.430797 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 09:29:29.597268 master-0 kubenswrapper[15202]: I0319 09:29:29.596618 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-j956x" Mar 19 09:29:29.627859 master-0 kubenswrapper[15202]: I0319 09:29:29.627666 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:29:29.762969 master-0 kubenswrapper[15202]: I0319 09:29:29.762918 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:29:29.783549 master-0 kubenswrapper[15202]: I0319 09:29:29.780892 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-dtscf" Mar 19 09:29:29.793142 master-0 kubenswrapper[15202]: I0319 09:29:29.793063 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:29:29.809534 master-0 kubenswrapper[15202]: I0319 09:29:29.809289 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:29:29.809958 master-0 kubenswrapper[15202]: W0319 09:29:29.809902 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaef34f27_dcdc_4887_b4ec_a3e36f45a527.slice/crio-def0c4ab3edcf3f37dd78809b40b644c99371e8b209d6ae448a9388588196a3c 
WatchSource:0}: Error finding container def0c4ab3edcf3f37dd78809b40b644c99371e8b209d6ae448a9388588196a3c: Status 404 returned error can't find the container with id def0c4ab3edcf3f37dd78809b40b644c99371e8b209d6ae448a9388588196a3c Mar 19 09:29:29.815749 master-0 kubenswrapper[15202]: I0319 09:29:29.815693 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 19 09:29:29.879835 master-0 kubenswrapper[15202]: I0319 09:29:29.878939 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 09:29:29.879835 master-0 kubenswrapper[15202]: I0319 09:29:29.879152 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 09:29:29.911177 master-0 kubenswrapper[15202]: I0319 09:29:29.910083 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:29:29.998804 master-0 kubenswrapper[15202]: I0319 09:29:29.997199 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:29:30.013078 master-0 kubenswrapper[15202]: I0319 09:29:30.012982 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:29:30.130288 master-0 kubenswrapper[15202]: I0319 09:29:30.130199 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:29:30.247862 master-0 kubenswrapper[15202]: I0319 09:29:30.247610 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-2qc5w" Mar 19 09:29:30.311235 master-0 kubenswrapper[15202]: I0319 09:29:30.311160 15202 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 09:29:30.421942 master-0 kubenswrapper[15202]: I0319 09:29:30.421885 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:29:30.471228 master-0 kubenswrapper[15202]: I0319 09:29:30.471159 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 09:29:30.497424 master-0 kubenswrapper[15202]: I0319 09:29:30.492463 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 09:29:30.530279 master-0 kubenswrapper[15202]: I0319 09:29:30.530208 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 19 09:29:30.558629 master-0 kubenswrapper[15202]: I0319 09:29:30.557887 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 09:29:30.561159 master-0 kubenswrapper[15202]: I0319 09:29:30.561117 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"aef34f27-dcdc-4887-b4ec-a3e36f45a527","Type":"ContainerStarted","Data":"f4a1769bb981c8867cb28c8bd8a9cd63256e35c808e96dfbbf95933c5511bc04"} Mar 19 09:29:30.561252 master-0 kubenswrapper[15202]: I0319 09:29:30.561163 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"aef34f27-dcdc-4887-b4ec-a3e36f45a527","Type":"ContainerStarted","Data":"def0c4ab3edcf3f37dd78809b40b644c99371e8b209d6ae448a9388588196a3c"} Mar 19 09:29:30.572051 master-0 kubenswrapper[15202]: I0319 09:29:30.571991 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vp2s5" Mar 19 
09:29:30.582541 master-0 kubenswrapper[15202]: I0319 09:29:30.581237 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=1.581213669 podStartE2EDuration="1.581213669s" podCreationTimestamp="2026-03-19 09:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:29:30.579210971 +0000 UTC m=+287.964625787" watchObservedRunningTime="2026-03-19 09:29:30.581213669 +0000 UTC m=+287.966628485" Mar 19 09:29:30.611891 master-0 kubenswrapper[15202]: I0319 09:29:30.611211 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-697d79fb97-jrvk4" Mar 19 09:29:30.621122 master-0 kubenswrapper[15202]: I0319 09:29:30.621079 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-697d79fb97-jrvk4" Mar 19 09:29:30.650688 master-0 kubenswrapper[15202]: I0319 09:29:30.650556 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-bvdqs" Mar 19 09:29:30.654366 master-0 kubenswrapper[15202]: I0319 09:29:30.654335 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 09:29:30.669913 master-0 kubenswrapper[15202]: I0319 09:29:30.669848 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 09:29:30.768682 master-0 kubenswrapper[15202]: I0319 09:29:30.768603 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 09:29:30.856134 master-0 kubenswrapper[15202]: I0319 09:29:30.856067 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 
09:29:30.883525 master-0 kubenswrapper[15202]: I0319 09:29:30.883424 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 09:29:30.889293 master-0 kubenswrapper[15202]: I0319 09:29:30.889219 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:29:30.910402 master-0 kubenswrapper[15202]: I0319 09:29:30.910236 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 09:29:30.982080 master-0 kubenswrapper[15202]: I0319 09:29:30.982010 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:29:31.003148 master-0 kubenswrapper[15202]: I0319 09:29:31.003063 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:29:31.023234 master-0 kubenswrapper[15202]: I0319 09:29:31.023154 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:29:31.103106 master-0 kubenswrapper[15202]: I0319 09:29:31.103046 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-llwk7" Mar 19 09:29:31.108216 master-0 kubenswrapper[15202]: I0319 09:29:31.108183 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:29:31.259315 master-0 kubenswrapper[15202]: I0319 09:29:31.259252 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 09:29:31.331753 master-0 kubenswrapper[15202]: I0319 09:29:31.331664 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:29:31.343720 master-0 kubenswrapper[15202]: I0319 09:29:31.343626 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:29:31.359197 master-0 kubenswrapper[15202]: I0319 09:29:31.359130 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:29:31.661496 master-0 kubenswrapper[15202]: I0319 09:29:31.661275 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cdc9755cd-fl679" Mar 19 09:29:31.666139 master-0 kubenswrapper[15202]: I0319 09:29:31.666060 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cdc9755cd-fl679" Mar 19 09:29:31.689813 master-0 kubenswrapper[15202]: I0319 09:29:31.689729 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 19 09:29:31.719334 master-0 kubenswrapper[15202]: I0319 09:29:31.719253 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:29:31.777656 master-0 kubenswrapper[15202]: I0319 09:29:31.777586 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:29:31.918937 master-0 kubenswrapper[15202]: I0319 09:29:31.918528 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:29:31.968891 master-0 kubenswrapper[15202]: I0319 09:29:31.968778 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 09:29:32.144539 master-0 kubenswrapper[15202]: I0319 09:29:32.144391 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:29:32.201356 master-0 kubenswrapper[15202]: I0319 09:29:32.201298 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 09:29:32.238307 master-0 kubenswrapper[15202]: I0319 09:29:32.238255 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:29:32.305411 master-0 kubenswrapper[15202]: I0319 09:29:32.305341 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 09:29:32.314358 master-0 kubenswrapper[15202]: I0319 09:29:32.314252 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:29:32.314883 master-0 kubenswrapper[15202]: I0319 09:29:32.314790 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor" containerID="cri-o://28d21e26324d8f2be58c9073e73edf161f575e4263dc7b071b5e8f96cd46fdee" gracePeriod=5 Mar 19 09:29:32.333772 master-0 kubenswrapper[15202]: I0319 09:29:32.333704 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 09:29:32.344356 master-0 kubenswrapper[15202]: I0319 09:29:32.344307 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 09:29:32.479203 master-0 kubenswrapper[15202]: I0319 09:29:32.479041 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 09:29:32.505078 master-0 kubenswrapper[15202]: I0319 09:29:32.505019 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 09:29:32.547645 master-0 kubenswrapper[15202]: I0319 09:29:32.547527 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 09:29:32.560209 master-0 kubenswrapper[15202]: I0319 09:29:32.560149 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 09:29:32.617800 master-0 kubenswrapper[15202]: I0319 09:29:32.617725 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:29:32.618481 master-0 kubenswrapper[15202]: I0319 09:29:32.618427 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 09:29:32.649368 master-0 kubenswrapper[15202]: I0319 09:29:32.649296 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:29:32.654842 master-0 kubenswrapper[15202]: I0319 09:29:32.654782 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:29:32.677231 master-0 kubenswrapper[15202]: I0319 09:29:32.677155 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-jwt9n" Mar 19 09:29:32.704650 master-0 kubenswrapper[15202]: I0319 09:29:32.704563 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:29:32.795931 master-0 kubenswrapper[15202]: I0319 09:29:32.795714 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 19 09:29:32.824818 master-0 kubenswrapper[15202]: I0319 09:29:32.824767 15202 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 09:29:32.883869 master-0 kubenswrapper[15202]: I0319 09:29:32.883740 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-pr68p" Mar 19 09:29:32.889575 master-0 kubenswrapper[15202]: I0319 09:29:32.889522 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:29:33.066893 master-0 kubenswrapper[15202]: I0319 09:29:33.066715 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-bgq5z" Mar 19 09:29:33.123038 master-0 kubenswrapper[15202]: I0319 09:29:33.122981 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:29:33.137419 master-0 kubenswrapper[15202]: I0319 09:29:33.137359 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 09:29:33.197940 master-0 kubenswrapper[15202]: I0319 09:29:33.197875 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:29:33.287150 master-0 kubenswrapper[15202]: I0319 09:29:33.287053 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 09:29:33.486335 master-0 kubenswrapper[15202]: I0319 09:29:33.486273 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:29:33.708954 master-0 kubenswrapper[15202]: I0319 09:29:33.708869 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 09:29:33.946818 master-0 kubenswrapper[15202]: I0319 09:29:33.944869 15202 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xvfqf" Mar 19 09:29:33.979029 master-0 kubenswrapper[15202]: I0319 09:29:33.978978 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cdc9755cd-fl679"] Mar 19 09:29:34.044812 master-0 kubenswrapper[15202]: I0319 09:29:34.044733 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 09:29:34.617047 master-0 kubenswrapper[15202]: I0319 09:29:34.616977 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 09:29:36.461454 master-0 kubenswrapper[15202]: I0319 09:29:36.461150 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7988f8bb7-j9w48"] Mar 19 09:29:36.462098 master-0 kubenswrapper[15202]: E0319 09:29:36.461647 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor" Mar 19 09:29:36.462098 master-0 kubenswrapper[15202]: I0319 09:29:36.461667 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor" Mar 19 09:29:36.462098 master-0 kubenswrapper[15202]: I0319 09:29:36.461930 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor" Mar 19 09:29:36.462750 master-0 kubenswrapper[15202]: I0319 09:29:36.462722 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.546406 master-0 kubenswrapper[15202]: I0319 09:29:36.546330 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7988f8bb7-j9w48"] Mar 19 09:29:36.570751 master-0 kubenswrapper[15202]: I0319 09:29:36.566678 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-service-ca\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.570751 master-0 kubenswrapper[15202]: I0319 09:29:36.566790 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9flx\" (UniqueName: \"kubernetes.io/projected/5ba72c91-3222-4576-b61a-1138e693508c-kube-api-access-h9flx\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.570751 master-0 kubenswrapper[15202]: I0319 09:29:36.566817 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-serving-cert\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.570751 master-0 kubenswrapper[15202]: I0319 09:29:36.566834 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-oauth-serving-cert\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.570751 master-0 
kubenswrapper[15202]: I0319 09:29:36.566867 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-trusted-ca-bundle\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.570751 master-0 kubenswrapper[15202]: I0319 09:29:36.566887 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-oauth-config\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.570751 master-0 kubenswrapper[15202]: I0319 09:29:36.566921 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-console-config\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.669589 master-0 kubenswrapper[15202]: I0319 09:29:36.669452 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-service-ca\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.669863 master-0 kubenswrapper[15202]: I0319 09:29:36.669668 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9flx\" (UniqueName: \"kubernetes.io/projected/5ba72c91-3222-4576-b61a-1138e693508c-kube-api-access-h9flx\") pod \"console-7988f8bb7-j9w48\" (UID: 
\"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.669863 master-0 kubenswrapper[15202]: I0319 09:29:36.669714 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-serving-cert\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.669863 master-0 kubenswrapper[15202]: I0319 09:29:36.669751 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-oauth-serving-cert\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.669863 master-0 kubenswrapper[15202]: I0319 09:29:36.669795 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-trusted-ca-bundle\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.669863 master-0 kubenswrapper[15202]: I0319 09:29:36.669827 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-oauth-config\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.670018 master-0 kubenswrapper[15202]: I0319 09:29:36.669883 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-console-config\") 
pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.671704 master-0 kubenswrapper[15202]: I0319 09:29:36.671634 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-service-ca\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.672173 master-0 kubenswrapper[15202]: I0319 09:29:36.672130 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-trusted-ca-bundle\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.672767 master-0 kubenswrapper[15202]: I0319 09:29:36.672720 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-console-config\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.672824 master-0 kubenswrapper[15202]: I0319 09:29:36.672790 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-oauth-serving-cert\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.677582 master-0 kubenswrapper[15202]: I0319 09:29:36.677527 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-serving-cert\") pod 
\"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.678745 master-0 kubenswrapper[15202]: I0319 09:29:36.678698 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-oauth-config\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.701083 master-0 kubenswrapper[15202]: I0319 09:29:36.701037 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9flx\" (UniqueName: \"kubernetes.io/projected/5ba72c91-3222-4576-b61a-1138e693508c-kube-api-access-h9flx\") pod \"console-7988f8bb7-j9w48\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") " pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:36.789731 master-0 kubenswrapper[15202]: I0319 09:29:36.789557 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:37.280725 master-0 kubenswrapper[15202]: I0319 09:29:37.280636 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7988f8bb7-j9w48"] Mar 19 09:29:37.284063 master-0 kubenswrapper[15202]: W0319 09:29:37.284003 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba72c91_3222_4576_b61a_1138e693508c.slice/crio-48f0e513fa7049ede31558465687d2f9534f116ef2503e2e44bf6ac258ab8b42 WatchSource:0}: Error finding container 48f0e513fa7049ede31558465687d2f9534f116ef2503e2e44bf6ac258ab8b42: Status 404 returned error can't find the container with id 48f0e513fa7049ede31558465687d2f9534f116ef2503e2e44bf6ac258ab8b42 Mar 19 09:29:37.635198 master-0 kubenswrapper[15202]: I0319 09:29:37.634867 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7988f8bb7-j9w48" event={"ID":"5ba72c91-3222-4576-b61a-1138e693508c","Type":"ContainerStarted","Data":"d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946"} Mar 19 09:29:37.636620 master-0 kubenswrapper[15202]: I0319 09:29:37.636379 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7988f8bb7-j9w48" event={"ID":"5ba72c91-3222-4576-b61a-1138e693508c","Type":"ContainerStarted","Data":"48f0e513fa7049ede31558465687d2f9534f116ef2503e2e44bf6ac258ab8b42"} Mar 19 09:29:37.637155 master-0 kubenswrapper[15202]: I0319 09:29:37.637114 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log" Mar 19 09:29:37.637275 master-0 kubenswrapper[15202]: I0319 09:29:37.637159 15202 generic.go:334] "Generic (PLEG): container finished" podID="16fb4ea7f83036d9c6adf3454fc7e9db" containerID="28d21e26324d8f2be58c9073e73edf161f575e4263dc7b071b5e8f96cd46fdee" 
exitCode=137 Mar 19 09:29:37.662253 master-0 kubenswrapper[15202]: I0319 09:29:37.662162 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7988f8bb7-j9w48" podStartSLOduration=1.662135672 podStartE2EDuration="1.662135672s" podCreationTimestamp="2026-03-19 09:29:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:29:37.65712312 +0000 UTC m=+295.042537936" watchObservedRunningTime="2026-03-19 09:29:37.662135672 +0000 UTC m=+295.047550488" Mar 19 09:29:37.919461 master-0 kubenswrapper[15202]: I0319 09:29:37.919331 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log" Mar 19 09:29:37.919461 master-0 kubenswrapper[15202]: I0319 09:29:37.919432 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:29:38.097454 master-0 kubenswrapper[15202]: I0319 09:29:38.097390 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 09:29:38.097701 master-0 kubenswrapper[15202]: I0319 09:29:38.097533 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 09:29:38.097701 master-0 kubenswrapper[15202]: I0319 09:29:38.097570 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 09:29:38.097701 master-0 kubenswrapper[15202]: I0319 09:29:38.097590 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 09:29:38.097812 master-0 kubenswrapper[15202]: I0319 09:29:38.097704 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log" (OuterVolumeSpecName: "var-log") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:38.097812 master-0 kubenswrapper[15202]: I0319 09:29:38.097749 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:38.097812 master-0 kubenswrapper[15202]: I0319 09:29:38.097788 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 09:29:38.097913 master-0 kubenswrapper[15202]: I0319 09:29:38.097823 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock" (OuterVolumeSpecName: "var-lock") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:38.097913 master-0 kubenswrapper[15202]: I0319 09:29:38.097843 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests" (OuterVolumeSpecName: "manifests") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:38.098286 master-0 kubenswrapper[15202]: I0319 09:29:38.098264 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:38.098335 master-0 kubenswrapper[15202]: I0319 09:29:38.098287 15202 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:38.098335 master-0 kubenswrapper[15202]: I0319 09:29:38.098298 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:38.098335 master-0 kubenswrapper[15202]: I0319 09:29:38.098310 15202 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:38.103338 master-0 kubenswrapper[15202]: I0319 09:29:38.103296 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:29:38.200069 master-0 kubenswrapper[15202]: I0319 09:29:38.200013 15202 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:29:38.647691 master-0 kubenswrapper[15202]: I0319 09:29:38.647425 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log" Mar 19 09:29:38.647691 master-0 kubenswrapper[15202]: I0319 09:29:38.647613 15202 scope.go:117] "RemoveContainer" containerID="28d21e26324d8f2be58c9073e73edf161f575e4263dc7b071b5e8f96cd46fdee" Mar 19 09:29:38.648318 master-0 kubenswrapper[15202]: I0319 09:29:38.647695 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:29:38.832636 master-0 kubenswrapper[15202]: I0319 09:29:38.832559 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" path="/var/lib/kubelet/pods/16fb4ea7f83036d9c6adf3454fc7e9db/volumes" Mar 19 09:29:38.832905 master-0 kubenswrapper[15202]: I0319 09:29:38.832850 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 19 09:29:38.914111 master-0 kubenswrapper[15202]: I0319 09:29:38.913970 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:29:38.914111 master-0 kubenswrapper[15202]: I0319 09:29:38.914016 15202 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="a4ed02f5-b251-4a54-80a1-22594f9db79c" Mar 19 09:29:38.920774 
master-0 kubenswrapper[15202]: I0319 09:29:38.920714 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:29:38.920774 master-0 kubenswrapper[15202]: I0319 09:29:38.920760 15202 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="a4ed02f5-b251-4a54-80a1-22594f9db79c" Mar 19 09:29:42.518651 master-0 kubenswrapper[15202]: I0319 09:29:42.518570 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:29:42.551714 master-0 kubenswrapper[15202]: I0319 09:29:42.551654 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:29:42.703724 master-0 kubenswrapper[15202]: I0319 09:29:42.703612 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:29:42.822456 master-0 kubenswrapper[15202]: I0319 09:29:42.822316 15202 kubelet.go:1505] "Image garbage collection succeeded" Mar 19 09:29:46.789920 master-0 kubenswrapper[15202]: I0319 09:29:46.789727 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:46.789920 master-0 kubenswrapper[15202]: I0319 09:29:46.789846 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:46.796304 master-0 kubenswrapper[15202]: I0319 09:29:46.796207 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:47.726495 master-0 kubenswrapper[15202]: I0319 09:29:47.726358 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7988f8bb7-j9w48" Mar 19 09:29:47.847055 
master-0 kubenswrapper[15202]: I0319 09:29:47.846955 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-697d79fb97-jrvk4"] Mar 19 09:29:54.022013 master-0 kubenswrapper[15202]: I0319 09:29:54.021951 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Mar 19 09:29:54.022938 master-0 kubenswrapper[15202]: I0319 09:29:54.022912 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.024696 master-0 kubenswrapper[15202]: I0319 09:29:54.024640 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wv2vd" Mar 19 09:29:54.025605 master-0 kubenswrapper[15202]: I0319 09:29:54.025559 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:29:54.053087 master-0 kubenswrapper[15202]: I0319 09:29:54.053030 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56655dac-d4e6-47d8-a143-25d1e27102c2-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.053282 master-0 kubenswrapper[15202]: I0319 09:29:54.053096 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.053282 master-0 kubenswrapper[15202]: I0319 09:29:54.053135 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.091404 master-0 kubenswrapper[15202]: I0319 09:29:54.091333 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Mar 19 09:29:54.155268 master-0 kubenswrapper[15202]: I0319 09:29:54.155182 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56655dac-d4e6-47d8-a143-25d1e27102c2-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.155268 master-0 kubenswrapper[15202]: I0319 09:29:54.155259 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.155572 master-0 kubenswrapper[15202]: I0319 09:29:54.155312 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.155572 master-0 kubenswrapper[15202]: I0319 09:29:54.155455 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.155572 master-0 kubenswrapper[15202]: I0319 09:29:54.155520 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.172555 master-0 kubenswrapper[15202]: I0319 09:29:54.172497 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56655dac-d4e6-47d8-a143-25d1e27102c2-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.341776 master-0 kubenswrapper[15202]: I0319 09:29:54.341634 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:29:54.770055 master-0 kubenswrapper[15202]: I0319 09:29:54.770006 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Mar 19 09:29:54.870530 master-0 kubenswrapper[15202]: I0319 09:29:54.869623 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"56655dac-d4e6-47d8-a143-25d1e27102c2","Type":"ContainerStarted","Data":"86618ed9204ff2f9a9c022c4697e952722ccef95392aa3786a84ebe71dbefe40"} Mar 19 09:29:55.878569 master-0 kubenswrapper[15202]: I0319 09:29:55.878507 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"56655dac-d4e6-47d8-a143-25d1e27102c2","Type":"ContainerStarted","Data":"c11deef3040052065cfaa18854714453aee0c2b75383da3b07acc28a813e9ee0"} Mar 19 09:29:55.944721 master-0 kubenswrapper[15202]: I0319 09:29:55.944629 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" podStartSLOduration=1.944605806 podStartE2EDuration="1.944605806s" podCreationTimestamp="2026-03-19 09:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:29:55.94353292 +0000 UTC m=+313.328947736" watchObservedRunningTime="2026-03-19 09:29:55.944605806 +0000 UTC m=+313.330020632" Mar 19 09:29:59.020277 master-0 kubenswrapper[15202]: I0319 09:29:59.020158 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-cdc9755cd-fl679" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console" containerID="cri-o://a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911" gracePeriod=15 Mar 19 
09:29:59.573208 master-0 kubenswrapper[15202]: I0319 09:29:59.573156 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cdc9755cd-fl679_46339f4c-f550-4303-b237-4014572b69c1/console/0.log" Mar 19 09:29:59.573456 master-0 kubenswrapper[15202]: I0319 09:29:59.573239 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdc9755cd-fl679" Mar 19 09:29:59.654346 master-0 kubenswrapper[15202]: I0319 09:29:59.654180 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-console-config\") pod \"46339f4c-f550-4303-b237-4014572b69c1\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " Mar 19 09:29:59.654346 master-0 kubenswrapper[15202]: I0319 09:29:59.654246 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-service-ca\") pod \"46339f4c-f550-4303-b237-4014572b69c1\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " Mar 19 09:29:59.654346 master-0 kubenswrapper[15202]: I0319 09:29:59.654294 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtnfz\" (UniqueName: \"kubernetes.io/projected/46339f4c-f550-4303-b237-4014572b69c1-kube-api-access-dtnfz\") pod \"46339f4c-f550-4303-b237-4014572b69c1\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " Mar 19 09:29:59.654688 master-0 kubenswrapper[15202]: I0319 09:29:59.654397 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-serving-cert\") pod \"46339f4c-f550-4303-b237-4014572b69c1\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " Mar 19 09:29:59.654688 master-0 kubenswrapper[15202]: I0319 09:29:59.654447 
15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-oauth-serving-cert\") pod \"46339f4c-f550-4303-b237-4014572b69c1\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " Mar 19 09:29:59.654688 master-0 kubenswrapper[15202]: I0319 09:29:59.654500 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-oauth-config\") pod \"46339f4c-f550-4303-b237-4014572b69c1\" (UID: \"46339f4c-f550-4303-b237-4014572b69c1\") " Mar 19 09:29:59.654806 master-0 kubenswrapper[15202]: I0319 09:29:59.654789 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-console-config" (OuterVolumeSpecName: "console-config") pod "46339f4c-f550-4303-b237-4014572b69c1" (UID: "46339f4c-f550-4303-b237-4014572b69c1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:29:59.655017 master-0 kubenswrapper[15202]: I0319 09:29:59.654959 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-service-ca" (OuterVolumeSpecName: "service-ca") pod "46339f4c-f550-4303-b237-4014572b69c1" (UID: "46339f4c-f550-4303-b237-4014572b69c1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:29:59.655214 master-0 kubenswrapper[15202]: I0319 09:29:59.655144 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "46339f4c-f550-4303-b237-4014572b69c1" (UID: "46339f4c-f550-4303-b237-4014572b69c1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:29:59.659088 master-0 kubenswrapper[15202]: I0319 09:29:59.659042 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46339f4c-f550-4303-b237-4014572b69c1-kube-api-access-dtnfz" (OuterVolumeSpecName: "kube-api-access-dtnfz") pod "46339f4c-f550-4303-b237-4014572b69c1" (UID: "46339f4c-f550-4303-b237-4014572b69c1"). InnerVolumeSpecName "kube-api-access-dtnfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:29:59.660734 master-0 kubenswrapper[15202]: I0319 09:29:59.660690 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "46339f4c-f550-4303-b237-4014572b69c1" (UID: "46339f4c-f550-4303-b237-4014572b69c1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:29:59.663400 master-0 kubenswrapper[15202]: I0319 09:29:59.660977 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "46339f4c-f550-4303-b237-4014572b69c1" (UID: "46339f4c-f550-4303-b237-4014572b69c1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:29:59.756308 master-0 kubenswrapper[15202]: I0319 09:29:59.756255 15202 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:29:59.756673 master-0 kubenswrapper[15202]: I0319 09:29:59.756654 15202 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-console-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:29:59.756784 master-0 kubenswrapper[15202]: I0319 09:29:59.756767 15202 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:29:59.756869 master-0 kubenswrapper[15202]: I0319 09:29:59.756855 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtnfz\" (UniqueName: \"kubernetes.io/projected/46339f4c-f550-4303-b237-4014572b69c1-kube-api-access-dtnfz\") on node \"master-0\" DevicePath \"\""
Mar 19 09:29:59.756970 master-0 kubenswrapper[15202]: I0319 09:29:59.756956 15202 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46339f4c-f550-4303-b237-4014572b69c1-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:29:59.757052 master-0 kubenswrapper[15202]: I0319 09:29:59.757039 15202 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46339f4c-f550-4303-b237-4014572b69c1-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:29:59.895742 master-0 kubenswrapper[15202]: I0319 09:29:59.895669 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c75dc494b-tvf5c"]
Mar 19 09:29:59.896073 master-0 kubenswrapper[15202]: E0319 09:29:59.896014 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console"
Mar 19 09:29:59.896073 master-0 kubenswrapper[15202]: I0319 09:29:59.896028 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console"
Mar 19 09:29:59.896283 master-0 kubenswrapper[15202]: I0319 09:29:59.896200 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="46339f4c-f550-4303-b237-4014572b69c1" containerName="console"
Mar 19 09:29:59.896762 master-0 kubenswrapper[15202]: I0319 09:29:59.896735 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:29:59.919219 master-0 kubenswrapper[15202]: I0319 09:29:59.919097 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cdc9755cd-fl679_46339f4c-f550-4303-b237-4014572b69c1/console/0.log"
Mar 19 09:29:59.919219 master-0 kubenswrapper[15202]: I0319 09:29:59.919166 15202 generic.go:334] "Generic (PLEG): container finished" podID="46339f4c-f550-4303-b237-4014572b69c1" containerID="a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911" exitCode=2
Mar 19 09:29:59.919219 master-0 kubenswrapper[15202]: I0319 09:29:59.919206 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdc9755cd-fl679" event={"ID":"46339f4c-f550-4303-b237-4014572b69c1","Type":"ContainerDied","Data":"a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911"}
Mar 19 09:29:59.919570 master-0 kubenswrapper[15202]: I0319 09:29:59.919243 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdc9755cd-fl679" event={"ID":"46339f4c-f550-4303-b237-4014572b69c1","Type":"ContainerDied","Data":"ddd0d08da1a19bd9635f3f69eb963eb666004399b604865aa340a8d2bd1f9bfb"}
Mar 19 09:29:59.919570 master-0 kubenswrapper[15202]: I0319 09:29:59.919280 15202 scope.go:117] "RemoveContainer" containerID="a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911"
Mar 19 09:29:59.919570 master-0 kubenswrapper[15202]: I0319 09:29:59.919488 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdc9755cd-fl679"
Mar 19 09:29:59.941137 master-0 kubenswrapper[15202]: I0319 09:29:59.940648 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c75dc494b-tvf5c"]
Mar 19 09:29:59.962913 master-0 kubenswrapper[15202]: I0319 09:29:59.962867 15202 scope.go:117] "RemoveContainer" containerID="a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911"
Mar 19 09:29:59.963573 master-0 kubenswrapper[15202]: E0319 09:29:59.963534 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911\": container with ID starting with a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911 not found: ID does not exist" containerID="a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911"
Mar 19 09:29:59.963619 master-0 kubenswrapper[15202]: I0319 09:29:59.963577 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911"} err="failed to get container status \"a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911\": rpc error: code = NotFound desc = could not find container \"a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911\": container with ID starting with a2e273cd578610d0ae26e29f2d8e98384f7102fca76730c4f977e8734e38d911 not found: ID does not exist"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063601 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4579z\" (UniqueName: \"kubernetes.io/projected/aa2134ce-a525-4d94-95ac-bd66b0117834-kube-api-access-4579z\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063677 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-serving-cert\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063708 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-service-ca\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063726 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-oauth-config\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063768 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-oauth-serving-cert\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063793 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-console-config\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.064492 master-0 kubenswrapper[15202]: I0319 09:30:00.063828 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-trusted-ca-bundle\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.089497 master-0 kubenswrapper[15202]: I0319 09:30:00.089039 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cdc9755cd-fl679"]
Mar 19 09:30:00.113112 master-0 kubenswrapper[15202]: I0319 09:30:00.113058 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cdc9755cd-fl679"]
Mar 19 09:30:00.165825 master-0 kubenswrapper[15202]: I0319 09:30:00.165738 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4579z\" (UniqueName: \"kubernetes.io/projected/aa2134ce-a525-4d94-95ac-bd66b0117834-kube-api-access-4579z\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.166168 master-0 kubenswrapper[15202]: I0319 09:30:00.165842 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-serving-cert\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.166168 master-0 kubenswrapper[15202]: I0319 09:30:00.165972 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-service-ca\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.166168 master-0 kubenswrapper[15202]: I0319 09:30:00.165998 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-oauth-config\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.166326 master-0 kubenswrapper[15202]: I0319 09:30:00.166177 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-oauth-serving-cert\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.167180 master-0 kubenswrapper[15202]: I0319 09:30:00.167141 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-service-ca\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.167265 master-0 kubenswrapper[15202]: I0319 09:30:00.167251 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-console-config\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.167432 master-0 kubenswrapper[15202]: I0319 09:30:00.167375 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-oauth-serving-cert\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.167519 master-0 kubenswrapper[15202]: I0319 09:30:00.167401 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-trusted-ca-bundle\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.168344 master-0 kubenswrapper[15202]: I0319 09:30:00.168288 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-console-config\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.168575 master-0 kubenswrapper[15202]: I0319 09:30:00.168548 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-trusted-ca-bundle\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.169338 master-0 kubenswrapper[15202]: I0319 09:30:00.169283 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-serving-cert\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.170667 master-0 kubenswrapper[15202]: I0319 09:30:00.170635 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-oauth-config\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.209388 master-0 kubenswrapper[15202]: I0319 09:30:00.209326 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4579z\" (UniqueName: \"kubernetes.io/projected/aa2134ce-a525-4d94-95ac-bd66b0117834-kube-api-access-4579z\") pod \"console-c75dc494b-tvf5c\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.235781 master-0 kubenswrapper[15202]: I0319 09:30:00.235719 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c75dc494b-tvf5c"
Mar 19 09:30:00.822242 master-0 kubenswrapper[15202]: I0319 09:30:00.822190 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46339f4c-f550-4303-b237-4014572b69c1" path="/var/lib/kubelet/pods/46339f4c-f550-4303-b237-4014572b69c1/volumes"
Mar 19 09:30:00.827725 master-0 kubenswrapper[15202]: W0319 09:30:00.827677 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2134ce_a525_4d94_95ac_bd66b0117834.slice/crio-8eeabd74c43ce3e703e2735198282c38bc27c954f606b6affb6b2e1f1718b268 WatchSource:0}: Error finding container 8eeabd74c43ce3e703e2735198282c38bc27c954f606b6affb6b2e1f1718b268: Status 404 returned error can't find the container with id 8eeabd74c43ce3e703e2735198282c38bc27c954f606b6affb6b2e1f1718b268
Mar 19 09:30:00.827984 master-0 kubenswrapper[15202]: I0319 09:30:00.827961 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c75dc494b-tvf5c"]
Mar 19 09:30:00.934878 master-0 kubenswrapper[15202]: I0319 09:30:00.933867 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c75dc494b-tvf5c" event={"ID":"aa2134ce-a525-4d94-95ac-bd66b0117834","Type":"ContainerStarted","Data":"8eeabd74c43ce3e703e2735198282c38bc27c954f606b6affb6b2e1f1718b268"}
Mar 19 09:30:01.351940 master-0 kubenswrapper[15202]: I0319 09:30:01.351787 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 09:30:01.352758 master-0 kubenswrapper[15202]: I0319 09:30:01.352734 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:30:01.353136 master-0 kubenswrapper[15202]: I0319 09:30:01.353097 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" containerID="cri-o://5dc0d234726aa7e09b7e006830783386358cf8a6aeab0626a8eb7cfa7413be9f" gracePeriod=30
Mar 19 09:30:01.353638 master-0 kubenswrapper[15202]: E0319 09:30:01.353573 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353638 master-0 kubenswrapper[15202]: I0319 09:30:01.353630 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353638 master-0 kubenswrapper[15202]: E0319 09:30:01.353647 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353831 master-0 kubenswrapper[15202]: I0319 09:30:01.353655 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353831 master-0 kubenswrapper[15202]: E0319 09:30:01.353681 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353831 master-0 kubenswrapper[15202]: I0319 09:30:01.353691 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353952 master-0 kubenswrapper[15202]: I0319 09:30:01.353914 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.353952 master-0 kubenswrapper[15202]: I0319 09:30:01.353938 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.354440 master-0 kubenswrapper[15202]: I0319 09:30:01.353970 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.354440 master-0 kubenswrapper[15202]: E0319 09:30:01.354157 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.354440 master-0 kubenswrapper[15202]: I0319 09:30:01.354170 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.354440 master-0 kubenswrapper[15202]: I0319 09:30:01.354348 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 09:30:01.356045 master-0 kubenswrapper[15202]: I0319 09:30:01.355701 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.403305 master-0 kubenswrapper[15202]: I0319 09:30:01.403210 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.403571 master-0 kubenswrapper[15202]: I0319 09:30:01.403321 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.505019 master-0 kubenswrapper[15202]: I0319 09:30:01.504801 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.505019 master-0 kubenswrapper[15202]: I0319 09:30:01.504872 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.505019 master-0 kubenswrapper[15202]: I0319 09:30:01.504975 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.505019 master-0 kubenswrapper[15202]: I0319 09:30:01.504991 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.752683 master-0 kubenswrapper[15202]: I0319 09:30:01.752620 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c75dc494b-tvf5c"]
Mar 19 09:30:01.754237 master-0 kubenswrapper[15202]: I0319 09:30:01.754177 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:01.764020 master-0 kubenswrapper[15202]: I0319 09:30:01.763431 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:30:01.944162 master-0 kubenswrapper[15202]: I0319 09:30:01.944102 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c75dc494b-tvf5c" event={"ID":"aa2134ce-a525-4d94-95ac-bd66b0117834","Type":"ContainerStarted","Data":"a1a3e5487a96cd5d9248a53f9af7002e6dfe2f953175099c22fd9212899644d8"}
Mar 19 09:30:01.945810 master-0 kubenswrapper[15202]: I0319 09:30:01.945724 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"08719c5c4bc464bfab200198ce996d7bf676c3a0e7b95ebda2aa0775bb8349fe"}
Mar 19 09:30:01.947889 master-0 kubenswrapper[15202]: I0319 09:30:01.947848 15202 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="5dc0d234726aa7e09b7e006830783386358cf8a6aeab0626a8eb7cfa7413be9f" exitCode=0
Mar 19 09:30:01.947988 master-0 kubenswrapper[15202]: I0319 09:30:01.947930 15202 scope.go:117] "RemoveContainer" containerID="92a2db24929eebeb86c10e4da2210d08ce4c067d7696a9c259054e240344e6fa"
Mar 19 09:30:01.949812 master-0 kubenswrapper[15202]: I0319 09:30:01.949757 15202 generic.go:334] "Generic (PLEG): container finished" podID="aef34f27-dcdc-4887-b4ec-a3e36f45a527" containerID="f4a1769bb981c8867cb28c8bd8a9cd63256e35c808e96dfbbf95933c5511bc04" exitCode=0
Mar 19 09:30:01.949880 master-0 kubenswrapper[15202]: I0319 09:30:01.949841 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"aef34f27-dcdc-4887-b4ec-a3e36f45a527","Type":"ContainerDied","Data":"f4a1769bb981c8867cb28c8bd8a9cd63256e35c808e96dfbbf95933c5511bc04"}
Mar 19 09:30:02.114115 master-0 kubenswrapper[15202]: I0319 09:30:02.114023 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:30:02.215810 master-0 kubenswrapper[15202]: I0319 09:30:02.215745 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") "
Mar 19 09:30:02.216020 master-0 kubenswrapper[15202]: I0319 09:30:02.215883 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets" (OuterVolumeSpecName: "secrets") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:30:02.216020 master-0 kubenswrapper[15202]: I0319 09:30:02.215919 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") "
Mar 19 09:30:02.216165 master-0 kubenswrapper[15202]: I0319 09:30:02.216044 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs" (OuterVolumeSpecName: "logs") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:30:02.216740 master-0 kubenswrapper[15202]: I0319 09:30:02.216703 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:02.216740 master-0 kubenswrapper[15202]: I0319 09:30:02.216735 15202 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:02.359795 master-0 kubenswrapper[15202]: I0319 09:30:02.359585 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c75dc494b-tvf5c" podStartSLOduration=3.359566075 podStartE2EDuration="3.359566075s" podCreationTimestamp="2026-03-19 09:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:02.354726427 +0000 UTC m=+319.740141243" watchObservedRunningTime="2026-03-19 09:30:02.359566075 +0000 UTC m=+319.744980891"
Mar 19 09:30:02.649121 master-0 kubenswrapper[15202]: I0319 09:30:02.648933 15202 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="b1d58ec9-90ca-45ca-b3a8-47e4be740512"
Mar 19 09:30:02.821345 master-0 kubenswrapper[15202]: I0319 09:30:02.821286 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83737980b9ee109184b1d78e942cf36" path="/var/lib/kubelet/pods/c83737980b9ee109184b1d78e942cf36/volumes"
Mar 19 09:30:02.821648 master-0 kubenswrapper[15202]: I0319 09:30:02.821625 15202 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID=""
Mar 19 09:30:02.957713 master-0 kubenswrapper[15202]: I0319 09:30:02.957654 15202 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc" exitCode=0
Mar 19 09:30:02.961070 master-0 kubenswrapper[15202]: I0319 09:30:02.961037 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 09:30:03.194154 master-0 kubenswrapper[15202]: I0319 09:30:03.194090 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc"}
Mar 19 09:30:03.194154 master-0 kubenswrapper[15202]: I0319 09:30:03.194145 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 09:30:03.194154 master-0 kubenswrapper[15202]: I0319 09:30:03.194159 15202 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="b1d58ec9-90ca-45ca-b3a8-47e4be740512"
Mar 19 09:30:03.194504 master-0 kubenswrapper[15202]: I0319 09:30:03.194185 15202 scope.go:117] "RemoveContainer" containerID="5dc0d234726aa7e09b7e006830783386358cf8a6aeab0626a8eb7cfa7413be9f"
Mar 19 09:30:03.199914 master-0 kubenswrapper[15202]: I0319 09:30:03.199848 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 09:30:03.199993 master-0 kubenswrapper[15202]: I0319 09:30:03.199911 15202 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="b1d58ec9-90ca-45ca-b3a8-47e4be740512"
Mar 19 09:30:03.440087 master-0 kubenswrapper[15202]: I0319 09:30:03.440039 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 19 09:30:03.551526 master-0 kubenswrapper[15202]: I0319 09:30:03.551461 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kube-api-access\") pod \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") "
Mar 19 09:30:03.551769 master-0 kubenswrapper[15202]: I0319 09:30:03.551737 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-var-lock\") pod \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") "
Mar 19 09:30:03.551838 master-0 kubenswrapper[15202]: I0319 09:30:03.551798 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kubelet-dir\") pod \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\" (UID: \"aef34f27-dcdc-4887-b4ec-a3e36f45a527\") "
Mar 19 09:30:03.551964 master-0 kubenswrapper[15202]: I0319 09:30:03.551789 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-var-lock" (OuterVolumeSpecName: "var-lock") pod "aef34f27-dcdc-4887-b4ec-a3e36f45a527" (UID: "aef34f27-dcdc-4887-b4ec-a3e36f45a527"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:30:03.551964 master-0 kubenswrapper[15202]: I0319 09:30:03.551855 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aef34f27-dcdc-4887-b4ec-a3e36f45a527" (UID: "aef34f27-dcdc-4887-b4ec-a3e36f45a527"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:30:03.552235 master-0 kubenswrapper[15202]: I0319 09:30:03.552205 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:03.552235 master-0 kubenswrapper[15202]: I0319 09:30:03.552229 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:03.557709 master-0 kubenswrapper[15202]: I0319 09:30:03.557662 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aef34f27-dcdc-4887-b4ec-a3e36f45a527" (UID: "aef34f27-dcdc-4887-b4ec-a3e36f45a527"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:30:03.654686 master-0 kubenswrapper[15202]: I0319 09:30:03.654627 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aef34f27-dcdc-4887-b4ec-a3e36f45a527-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:03.981360 master-0 kubenswrapper[15202]: I0319 09:30:03.981269 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"aef34f27-dcdc-4887-b4ec-a3e36f45a527","Type":"ContainerDied","Data":"def0c4ab3edcf3f37dd78809b40b644c99371e8b209d6ae448a9388588196a3c"}
Mar 19 09:30:03.981360 master-0 kubenswrapper[15202]: I0319 09:30:03.981322 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def0c4ab3edcf3f37dd78809b40b644c99371e8b209d6ae448a9388588196a3c"
Mar 19 09:30:03.981747 master-0 kubenswrapper[15202]: I0319 09:30:03.981393 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 19 09:30:03.987451 master-0 kubenswrapper[15202]: I0319 09:30:03.987347 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914"}
Mar 19 09:30:03.987451 master-0 kubenswrapper[15202]: I0319 09:30:03.987410 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a"}
Mar 19 09:30:04.256748 master-0 kubenswrapper[15202]: I0319 09:30:04.256671 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54cf565479-phtrp"]
Mar 19 09:30:04.257065 master-0 kubenswrapper[15202]: E0319 09:30:04.257034 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef34f27-dcdc-4887-b4ec-a3e36f45a527" containerName="installer"
Mar 19 09:30:04.257065 master-0 kubenswrapper[15202]: I0319 09:30:04.257056 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef34f27-dcdc-4887-b4ec-a3e36f45a527" containerName="installer"
Mar 19 09:30:04.257423 master-0 kubenswrapper[15202]: I0319 09:30:04.257397 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef34f27-dcdc-4887-b4ec-a3e36f45a527" containerName="installer"
Mar 19 09:30:04.258499 master-0 kubenswrapper[15202]: I0319 09:30:04.258439 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54cf565479-phtrp"
Mar 19 09:30:04.272698 master-0 kubenswrapper[15202]: I0319 09:30:04.272639 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54cf565479-phtrp"]
Mar 19 09:30:04.368182 master-0 kubenswrapper[15202]: I0319 09:30:04.368018 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-service-ca\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp"
Mar 19 09:30:04.368182 master-0 kubenswrapper[15202]: I0319 09:30:04.368101 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-trusted-ca-bundle\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp"
Mar 19 09:30:04.368182 master-0 kubenswrapper[15202]: I0319 09:30:04.368139 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-oauth-serving-cert\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp"
Mar 19 09:30:04.368182 master-0 kubenswrapper[15202]: I0319 09:30:04.368181 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbxf\" (UniqueName: \"kubernetes.io/projected/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-kube-api-access-jcbxf\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp"
Mar 19 09:30:04.368650 master-0
kubenswrapper[15202]: I0319 09:30:04.368243 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-config\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.368650 master-0 kubenswrapper[15202]: I0319 09:30:04.368282 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-serving-cert\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.368650 master-0 kubenswrapper[15202]: I0319 09:30:04.368311 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-oauth-config\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.469888 master-0 kubenswrapper[15202]: I0319 09:30:04.469806 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbxf\" (UniqueName: \"kubernetes.io/projected/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-kube-api-access-jcbxf\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.470658 master-0 kubenswrapper[15202]: I0319 09:30:04.469946 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-config\") pod \"console-54cf565479-phtrp\" (UID: 
\"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.471033 master-0 kubenswrapper[15202]: I0319 09:30:04.470978 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-config\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.471132 master-0 kubenswrapper[15202]: I0319 09:30:04.471079 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-serving-cert\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.471681 master-0 kubenswrapper[15202]: I0319 09:30:04.471654 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-oauth-config\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.471773 master-0 kubenswrapper[15202]: I0319 09:30:04.471716 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-service-ca\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.471824 master-0 kubenswrapper[15202]: I0319 09:30:04.471787 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-trusted-ca-bundle\") pod 
\"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.472234 master-0 kubenswrapper[15202]: I0319 09:30:04.472156 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-oauth-serving-cert\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.472915 master-0 kubenswrapper[15202]: I0319 09:30:04.472871 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-service-ca\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.473317 master-0 kubenswrapper[15202]: I0319 09:30:04.473283 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-trusted-ca-bundle\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.474126 master-0 kubenswrapper[15202]: I0319 09:30:04.474075 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-oauth-serving-cert\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.474861 master-0 kubenswrapper[15202]: I0319 09:30:04.474814 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-serving-cert\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.475938 master-0 kubenswrapper[15202]: I0319 09:30:04.475664 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-oauth-config\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.573979 master-0 kubenswrapper[15202]: I0319 09:30:04.573923 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcbxf\" (UniqueName: \"kubernetes.io/projected/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-kube-api-access-jcbxf\") pod \"console-54cf565479-phtrp\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:04.577316 master-0 kubenswrapper[15202]: I0319 09:30:04.577274 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:05.002445 master-0 kubenswrapper[15202]: I0319 09:30:05.002317 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104"} Mar 19 09:30:05.002687 master-0 kubenswrapper[15202]: I0319 09:30:05.002502 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:30:05.029065 master-0 kubenswrapper[15202]: I0319 09:30:05.028877 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=4.028842549 podStartE2EDuration="4.028842549s" podCreationTimestamp="2026-03-19 09:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:05.02310166 +0000 UTC m=+322.408516496" watchObservedRunningTime="2026-03-19 09:30:05.028842549 +0000 UTC m=+322.414257365" Mar 19 09:30:05.054712 master-0 kubenswrapper[15202]: I0319 09:30:05.054656 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54cf565479-phtrp"] Mar 19 09:30:05.056746 master-0 kubenswrapper[15202]: W0319 09:30:05.056701 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00e8ef85_6a94_43c0_bc66_d23d4094eb8a.slice/crio-c8a8ec04d0a38482ac5ca985e52552c04c68d6f99fd83ddb6fb415395b20d70c WatchSource:0}: Error finding container c8a8ec04d0a38482ac5ca985e52552c04c68d6f99fd83ddb6fb415395b20d70c: Status 404 returned error can't find the container with id c8a8ec04d0a38482ac5ca985e52552c04c68d6f99fd83ddb6fb415395b20d70c Mar 19 09:30:06.014341 master-0 
kubenswrapper[15202]: I0319 09:30:06.014244 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54cf565479-phtrp" event={"ID":"00e8ef85-6a94-43c0-bc66-d23d4094eb8a","Type":"ContainerStarted","Data":"b3efc8bceb0cac8b0e654e7e6b0770723ce3fb21a12b990155ed74274670b830"} Mar 19 09:30:06.014341 master-0 kubenswrapper[15202]: I0319 09:30:06.014336 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54cf565479-phtrp" event={"ID":"00e8ef85-6a94-43c0-bc66-d23d4094eb8a","Type":"ContainerStarted","Data":"c8a8ec04d0a38482ac5ca985e52552c04c68d6f99fd83ddb6fb415395b20d70c"} Mar 19 09:30:10.236777 master-0 kubenswrapper[15202]: I0319 09:30:10.236664 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c75dc494b-tvf5c" Mar 19 09:30:10.465240 master-0 kubenswrapper[15202]: I0319 09:30:10.465132 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54cf565479-phtrp" podStartSLOduration=9.465107568 podStartE2EDuration="9.465107568s" podCreationTimestamp="2026-03-19 09:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:06.043059699 +0000 UTC m=+323.428474525" watchObservedRunningTime="2026-03-19 09:30:10.465107568 +0000 UTC m=+327.850522394" Mar 19 09:30:10.469589 master-0 kubenswrapper[15202]: I0319 09:30:10.469549 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 09:30:10.470747 master-0 kubenswrapper[15202]: I0319 09:30:10.470717 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.473010 master-0 kubenswrapper[15202]: I0319 09:30:10.472981 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-mmblv" Mar 19 09:30:10.473355 master-0 kubenswrapper[15202]: I0319 09:30:10.473341 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 09:30:10.486022 master-0 kubenswrapper[15202]: I0319 09:30:10.485960 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 09:30:10.625323 master-0 kubenswrapper[15202]: I0319 09:30:10.625235 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-var-lock\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.625323 master-0 kubenswrapper[15202]: I0319 09:30:10.625320 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kube-api-access\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.625621 master-0 kubenswrapper[15202]: I0319 09:30:10.625366 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.727714 master-0 kubenswrapper[15202]: I0319 09:30:10.727633 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-var-lock\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.727714 master-0 kubenswrapper[15202]: I0319 09:30:10.727726 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kube-api-access\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.728007 master-0 kubenswrapper[15202]: I0319 09:30:10.727775 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.728007 master-0 kubenswrapper[15202]: I0319 09:30:10.727955 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.728065 master-0 kubenswrapper[15202]: I0319 09:30:10.728005 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-var-lock\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.744916 master-0 kubenswrapper[15202]: I0319 09:30:10.744760 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kube-api-access\") pod \"installer-2-master-0\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:10.801666 master-0 kubenswrapper[15202]: I0319 09:30:10.801597 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:11.292732 master-0 kubenswrapper[15202]: I0319 09:30:11.292536 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 09:30:12.074450 master-0 kubenswrapper[15202]: I0319 09:30:12.074377 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"7e9b2506-dac6-4a23-b2bf-e3ce77919857","Type":"ContainerStarted","Data":"8166208a9de3db4478dd98a7e82a28a5f0fe8ddb5b9e4ea9b0c1e5bc58a5e93f"} Mar 19 09:30:12.074450 master-0 kubenswrapper[15202]: I0319 09:30:12.074439 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"7e9b2506-dac6-4a23-b2bf-e3ce77919857","Type":"ContainerStarted","Data":"de7c9fc4f4b28fcca9ecb1e733759d6854036889522090fe12a41cd2b0b03956"} Mar 19 09:30:12.101994 master-0 kubenswrapper[15202]: I0319 09:30:12.101906 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.1018794 podStartE2EDuration="2.1018794s" podCreationTimestamp="2026-03-19 09:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:12.100300662 +0000 UTC m=+329.485715478" watchObservedRunningTime="2026-03-19 09:30:12.1018794 +0000 UTC m=+329.487294236" Mar 19 09:30:12.894605 master-0 kubenswrapper[15202]: I0319 09:30:12.894504 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-697d79fb97-jrvk4" 
podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" containerID="cri-o://4088813fc22eecbc208070992edf1790b236e8a23cc85474648baf7d6dc7ecb8" gracePeriod=15 Mar 19 09:30:13.090763 master-0 kubenswrapper[15202]: I0319 09:30:13.090711 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-697d79fb97-jrvk4_8157e508-83eb-416e-9c10-f193cd4dbd53/console/0.log" Mar 19 09:30:13.090763 master-0 kubenswrapper[15202]: I0319 09:30:13.090768 15202 generic.go:334] "Generic (PLEG): container finished" podID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerID="4088813fc22eecbc208070992edf1790b236e8a23cc85474648baf7d6dc7ecb8" exitCode=2 Mar 19 09:30:13.091955 master-0 kubenswrapper[15202]: I0319 09:30:13.091928 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697d79fb97-jrvk4" event={"ID":"8157e508-83eb-416e-9c10-f193cd4dbd53","Type":"ContainerDied","Data":"4088813fc22eecbc208070992edf1790b236e8a23cc85474648baf7d6dc7ecb8"} Mar 19 09:30:13.445935 master-0 kubenswrapper[15202]: I0319 09:30:13.445855 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-697d79fb97-jrvk4_8157e508-83eb-416e-9c10-f193cd4dbd53/console/0.log" Mar 19 09:30:13.446439 master-0 kubenswrapper[15202]: I0319 09:30:13.446001 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-697d79fb97-jrvk4" Mar 19 09:30:13.582537 master-0 kubenswrapper[15202]: I0319 09:30:13.582439 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-console-config\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.582966 master-0 kubenswrapper[15202]: I0319 09:30:13.582946 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-serving-cert\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.583698 master-0 kubenswrapper[15202]: I0319 09:30:13.583674 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-oauth-config\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.583864 master-0 kubenswrapper[15202]: I0319 09:30:13.583820 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-trusted-ca-bundle\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.584056 master-0 kubenswrapper[15202]: I0319 09:30:13.584042 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-oauth-serving-cert\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.584160 master-0 kubenswrapper[15202]: 
I0319 09:30:13.584143 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-service-ca\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.584387 master-0 kubenswrapper[15202]: I0319 09:30:13.584368 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h94j6\" (UniqueName: \"kubernetes.io/projected/8157e508-83eb-416e-9c10-f193cd4dbd53-kube-api-access-h94j6\") pod \"8157e508-83eb-416e-9c10-f193cd4dbd53\" (UID: \"8157e508-83eb-416e-9c10-f193cd4dbd53\") " Mar 19 09:30:13.584733 master-0 kubenswrapper[15202]: I0319 09:30:13.583046 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-console-config" (OuterVolumeSpecName: "console-config") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:13.585247 master-0 kubenswrapper[15202]: I0319 09:30:13.585013 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:13.585388 master-0 kubenswrapper[15202]: I0319 09:30:13.585339 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-service-ca" (OuterVolumeSpecName: "service-ca") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:13.585566 master-0 kubenswrapper[15202]: I0319 09:30:13.585519 15202 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:13.585971 master-0 kubenswrapper[15202]: I0319 09:30:13.585910 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:13.586781 master-0 kubenswrapper[15202]: I0319 09:30:13.586749 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:30:13.587758 master-0 kubenswrapper[15202]: I0319 09:30:13.587703 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:30:13.588898 master-0 kubenswrapper[15202]: I0319 09:30:13.588876 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8157e508-83eb-416e-9c10-f193cd4dbd53-kube-api-access-h94j6" (OuterVolumeSpecName: "kube-api-access-h94j6") pod "8157e508-83eb-416e-9c10-f193cd4dbd53" (UID: "8157e508-83eb-416e-9c10-f193cd4dbd53"). InnerVolumeSpecName "kube-api-access-h94j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:13.687342 master-0 kubenswrapper[15202]: I0319 09:30:13.687276 15202 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:13.687342 master-0 kubenswrapper[15202]: I0319 09:30:13.687320 15202 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8157e508-83eb-416e-9c10-f193cd4dbd53-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:13.687342 master-0 kubenswrapper[15202]: I0319 09:30:13.687333 15202 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:13.687342 master-0 kubenswrapper[15202]: I0319 09:30:13.687344 15202 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:13.687342 master-0 kubenswrapper[15202]: I0319 09:30:13.687355 15202 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8157e508-83eb-416e-9c10-f193cd4dbd53-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 
09:30:13.687820 master-0 kubenswrapper[15202]: I0319 09:30:13.687365 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h94j6\" (UniqueName: \"kubernetes.io/projected/8157e508-83eb-416e-9c10-f193cd4dbd53-kube-api-access-h94j6\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:14.099646 master-0 kubenswrapper[15202]: I0319 09:30:14.099540 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-697d79fb97-jrvk4_8157e508-83eb-416e-9c10-f193cd4dbd53/console/0.log" Mar 19 09:30:14.100444 master-0 kubenswrapper[15202]: I0319 09:30:14.100383 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697d79fb97-jrvk4" event={"ID":"8157e508-83eb-416e-9c10-f193cd4dbd53","Type":"ContainerDied","Data":"d64ac5047f232dcbfc61a55bd66d19afa32b52101b07d274dfb77365432ce423"} Mar 19 09:30:14.100598 master-0 kubenswrapper[15202]: I0319 09:30:14.100516 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-697d79fb97-jrvk4" Mar 19 09:30:14.100782 master-0 kubenswrapper[15202]: I0319 09:30:14.100580 15202 scope.go:117] "RemoveContainer" containerID="4088813fc22eecbc208070992edf1790b236e8a23cc85474648baf7d6dc7ecb8" Mar 19 09:30:14.139739 master-0 kubenswrapper[15202]: I0319 09:30:14.139682 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-697d79fb97-jrvk4"] Mar 19 09:30:14.144308 master-0 kubenswrapper[15202]: I0319 09:30:14.144263 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-697d79fb97-jrvk4"] Mar 19 09:30:14.578293 master-0 kubenswrapper[15202]: I0319 09:30:14.578215 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:14.578293 master-0 kubenswrapper[15202]: I0319 09:30:14.578270 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:14.586183 master-0 kubenswrapper[15202]: I0319 09:30:14.586120 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:14.822453 master-0 kubenswrapper[15202]: I0319 09:30:14.822372 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" path="/var/lib/kubelet/pods/8157e508-83eb-416e-9c10-f193cd4dbd53/volumes" Mar 19 09:30:15.124133 master-0 kubenswrapper[15202]: I0319 09:30:15.124043 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:30:15.219321 master-0 kubenswrapper[15202]: I0319 09:30:15.219243 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7988f8bb7-j9w48"] Mar 19 09:30:27.964417 master-0 kubenswrapper[15202]: E0319 09:30:27.963962 15202 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-controller-manager-pod.yaml\": /etc/kubernetes/manifests/kube-controller-manager-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 19 09:30:27.964417 master-0 kubenswrapper[15202]: I0319 09:30:27.964044 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:30:27.964417 master-0 kubenswrapper[15202]: I0319 09:30:27.964384 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" containerID="cri-o://a7909254e1fd575ef7a679770eb6617922c50b1fbb682ef07075bcdacdc5e021" gracePeriod=30 Mar 19 09:30:27.965226 master-0 kubenswrapper[15202]: I0319 09:30:27.964637 15202 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://c07894aa55def3d2147701356df1f2900a277d1378259aefae49e515291dc919" gracePeriod=30 Mar 19 09:30:27.970633 master-0 kubenswrapper[15202]: I0319 09:30:27.970306 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: E0319 09:30:27.970990 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971028 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: E0319 09:30:27.971050 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971058 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: E0319 09:30:27.971073 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971082 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: E0319 09:30:27.971106 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 
09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971114 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: E0319 09:30:27.971126 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971133 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: E0319 09:30:27.971148 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971156 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971315 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971334 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971366 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8157e508-83eb-416e-9c10-f193cd4dbd53" containerName="console" Mar 19 09:30:27.971744 master-0 kubenswrapper[15202]: I0319 09:30:27.971385 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 09:30:27.972238 master-0 kubenswrapper[15202]: I0319 09:30:27.971929 
15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.972238 master-0 kubenswrapper[15202]: I0319 09:30:27.971953 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 09:30:27.973983 master-0 kubenswrapper[15202]: I0319 09:30:27.973947 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.018246 master-0 kubenswrapper[15202]: I0319 09:30:28.018142 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c75dc494b-tvf5c" podUID="aa2134ce-a525-4d94-95ac-bd66b0117834" containerName="console" containerID="cri-o://a1a3e5487a96cd5d9248a53f9af7002e6dfe2f953175099c22fd9212899644d8" gracePeriod=15 Mar 19 09:30:28.068025 master-0 kubenswrapper[15202]: I0319 09:30:28.067969 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.068025 master-0 kubenswrapper[15202]: I0319 09:30:28.068028 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.175295 master-0 kubenswrapper[15202]: I0319 09:30:28.175209 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.175295 master-0 kubenswrapper[15202]: I0319 09:30:28.175290 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.175589 master-0 kubenswrapper[15202]: I0319 09:30:28.175499 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.175681 master-0 kubenswrapper[15202]: I0319 09:30:28.175650 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.248571 master-0 kubenswrapper[15202]: I0319 09:30:28.248394 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c75dc494b-tvf5c_aa2134ce-a525-4d94-95ac-bd66b0117834/console/0.log" Mar 19 09:30:28.248571 master-0 kubenswrapper[15202]: I0319 09:30:28.248453 15202 generic.go:334] "Generic (PLEG): container finished" podID="aa2134ce-a525-4d94-95ac-bd66b0117834" containerID="a1a3e5487a96cd5d9248a53f9af7002e6dfe2f953175099c22fd9212899644d8" 
exitCode=2 Mar 19 09:30:28.248571 master-0 kubenswrapper[15202]: I0319 09:30:28.248528 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c75dc494b-tvf5c" event={"ID":"aa2134ce-a525-4d94-95ac-bd66b0117834","Type":"ContainerDied","Data":"a1a3e5487a96cd5d9248a53f9af7002e6dfe2f953175099c22fd9212899644d8"} Mar 19 09:30:28.252627 master-0 kubenswrapper[15202]: I0319 09:30:28.251059 15202 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="c07894aa55def3d2147701356df1f2900a277d1378259aefae49e515291dc919" exitCode=0 Mar 19 09:30:28.252627 master-0 kubenswrapper[15202]: I0319 09:30:28.251102 15202 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="a7909254e1fd575ef7a679770eb6617922c50b1fbb682ef07075bcdacdc5e021" exitCode=0 Mar 19 09:30:28.252627 master-0 kubenswrapper[15202]: I0319 09:30:28.251159 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c171ddb54937f0fd497c7f13aa1ee85cbccbc426b3b298a16ea2532494259ede" Mar 19 09:30:28.252627 master-0 kubenswrapper[15202]: I0319 09:30:28.251207 15202 scope.go:117] "RemoveContainer" containerID="001fc753c737d087f54c387541563468e1ec47f8c52877439703afa3d14d7411" Mar 19 09:30:28.254360 master-0 kubenswrapper[15202]: I0319 09:30:28.253753 15202 generic.go:334] "Generic (PLEG): container finished" podID="56655dac-d4e6-47d8-a143-25d1e27102c2" containerID="c11deef3040052065cfaa18854714453aee0c2b75383da3b07acc28a813e9ee0" exitCode=0 Mar 19 09:30:28.254360 master-0 kubenswrapper[15202]: I0319 09:30:28.253815 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"56655dac-d4e6-47d8-a143-25d1e27102c2","Type":"ContainerDied","Data":"c11deef3040052065cfaa18854714453aee0c2b75383da3b07acc28a813e9ee0"} Mar 19 09:30:28.345261 master-0 kubenswrapper[15202]: I0319 09:30:28.345193 15202 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:30:28.354954 master-0 kubenswrapper[15202]: I0319 09:30:28.354884 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:30:28.360068 master-0 kubenswrapper[15202]: I0319 09:30:28.360015 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:30:28.387936 master-0 kubenswrapper[15202]: I0319 09:30:28.387493 15202 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="9b364b0d-54bc-4f78-920e-9ca2e7b86cbc" Mar 19 09:30:28.503860 master-0 kubenswrapper[15202]: I0319 09:30:28.503739 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:30:28.504119 master-0 kubenswrapper[15202]: I0319 09:30:28.504098 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:30:28.504298 master-0 kubenswrapper[15202]: I0319 09:30:28.503857 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets" (OuterVolumeSpecName: "secrets") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:28.504359 master-0 kubenswrapper[15202]: I0319 09:30:28.504160 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config" (OuterVolumeSpecName: "config") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:28.504359 master-0 kubenswrapper[15202]: I0319 09:30:28.504275 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:30:28.504505 master-0 kubenswrapper[15202]: I0319 09:30:28.504486 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "ssl-certs-host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:28.504587 master-0 kubenswrapper[15202]: I0319 09:30:28.504560 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:30:28.504651 master-0 kubenswrapper[15202]: I0319 09:30:28.504595 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 09:30:28.504799 master-0 kubenswrapper[15202]: I0319 09:30:28.504784 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:28.504872 master-0 kubenswrapper[15202]: I0319 09:30:28.504759 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs" (OuterVolumeSpecName: "logs") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:28.505543 master-0 kubenswrapper[15202]: I0319 09:30:28.505507 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.505543 master-0 kubenswrapper[15202]: I0319 09:30:28.505534 15202 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.505698 master-0 kubenswrapper[15202]: I0319 09:30:28.505547 15202 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.505698 master-0 kubenswrapper[15202]: I0319 09:30:28.505559 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.505698 master-0 kubenswrapper[15202]: I0319 09:30:28.505569 15202 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.519583 master-0 kubenswrapper[15202]: I0319 09:30:28.519541 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c75dc494b-tvf5c_aa2134ce-a525-4d94-95ac-bd66b0117834/console/0.log" Mar 19 09:30:28.519899 master-0 kubenswrapper[15202]: I0319 09:30:28.519882 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c75dc494b-tvf5c" Mar 19 09:30:28.611503 master-0 kubenswrapper[15202]: I0319 09:30:28.609623 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4579z\" (UniqueName: \"kubernetes.io/projected/aa2134ce-a525-4d94-95ac-bd66b0117834-kube-api-access-4579z\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.611503 master-0 kubenswrapper[15202]: I0319 09:30:28.609761 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-serving-cert\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.611503 master-0 kubenswrapper[15202]: I0319 09:30:28.609943 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-oauth-config\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.611503 master-0 kubenswrapper[15202]: I0319 09:30:28.611091 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-oauth-serving-cert\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.611503 master-0 kubenswrapper[15202]: I0319 09:30:28.611137 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-console-config\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.611503 master-0 
kubenswrapper[15202]: I0319 09:30:28.611173 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-trusted-ca-bundle\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.611503 master-0 kubenswrapper[15202]: I0319 09:30:28.611202 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-service-ca\") pod \"aa2134ce-a525-4d94-95ac-bd66b0117834\" (UID: \"aa2134ce-a525-4d94-95ac-bd66b0117834\") " Mar 19 09:30:28.612093 master-0 kubenswrapper[15202]: I0319 09:30:28.611827 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-console-config" (OuterVolumeSpecName: "console-config") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:28.612093 master-0 kubenswrapper[15202]: I0319 09:30:28.612020 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:28.612093 master-0 kubenswrapper[15202]: I0319 09:30:28.612030 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:28.612926 master-0 kubenswrapper[15202]: I0319 09:30:28.612851 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:28.614849 master-0 kubenswrapper[15202]: I0319 09:30:28.614796 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:30:28.614991 master-0 kubenswrapper[15202]: I0319 09:30:28.614939 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:30:28.615943 master-0 kubenswrapper[15202]: I0319 09:30:28.615895 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2134ce-a525-4d94-95ac-bd66b0117834-kube-api-access-4579z" (OuterVolumeSpecName: "kube-api-access-4579z") pod "aa2134ce-a525-4d94-95ac-bd66b0117834" (UID: "aa2134ce-a525-4d94-95ac-bd66b0117834"). InnerVolumeSpecName "kube-api-access-4579z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:28.714692 master-0 kubenswrapper[15202]: I0319 09:30:28.714635 15202 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.714692 master-0 kubenswrapper[15202]: I0319 09:30:28.714690 15202 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.714864 master-0 kubenswrapper[15202]: I0319 09:30:28.714705 15202 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.714864 master-0 kubenswrapper[15202]: I0319 09:30:28.714718 15202 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa2134ce-a525-4d94-95ac-bd66b0117834-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.714864 master-0 kubenswrapper[15202]: I0319 09:30:28.714733 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4579z\" (UniqueName: \"kubernetes.io/projected/aa2134ce-a525-4d94-95ac-bd66b0117834-kube-api-access-4579z\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.714864 master-0 kubenswrapper[15202]: I0319 09:30:28.714746 15202 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.714864 master-0 kubenswrapper[15202]: I0319 09:30:28.714758 15202 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/aa2134ce-a525-4d94-95ac-bd66b0117834-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:28.823553 master-0 kubenswrapper[15202]: I0319 09:30:28.823481 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f265536aba6292ead501bc9b49f327" path="/var/lib/kubelet/pods/46f265536aba6292ead501bc9b49f327/volumes" Mar 19 09:30:28.824058 master-0 kubenswrapper[15202]: I0319 09:30:28.824024 15202 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 19 09:30:28.890666 master-0 kubenswrapper[15202]: I0319 09:30:28.890565 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:30:28.890666 master-0 kubenswrapper[15202]: I0319 09:30:28.890655 15202 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="9b364b0d-54bc-4f78-920e-9ca2e7b86cbc" Mar 19 09:30:28.894550 master-0 kubenswrapper[15202]: I0319 09:30:28.894491 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 09:30:28.894661 master-0 kubenswrapper[15202]: I0319 09:30:28.894572 15202 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="9b364b0d-54bc-4f78-920e-9ca2e7b86cbc" Mar 19 09:30:29.265895 master-0 kubenswrapper[15202]: I0319 09:30:29.265836 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c75dc494b-tvf5c_aa2134ce-a525-4d94-95ac-bd66b0117834/console/0.log" Mar 19 09:30:29.266591 master-0 kubenswrapper[15202]: I0319 09:30:29.265956 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c75dc494b-tvf5c" 
event={"ID":"aa2134ce-a525-4d94-95ac-bd66b0117834","Type":"ContainerDied","Data":"8eeabd74c43ce3e703e2735198282c38bc27c954f606b6affb6b2e1f1718b268"} Mar 19 09:30:29.266591 master-0 kubenswrapper[15202]: I0319 09:30:29.266007 15202 scope.go:117] "RemoveContainer" containerID="a1a3e5487a96cd5d9248a53f9af7002e6dfe2f953175099c22fd9212899644d8" Mar 19 09:30:29.266591 master-0 kubenswrapper[15202]: I0319 09:30:29.266128 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c75dc494b-tvf5c" Mar 19 09:30:29.280491 master-0 kubenswrapper[15202]: I0319 09:30:29.280429 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 09:30:29.294064 master-0 kubenswrapper[15202]: I0319 09:30:29.293904 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"3c9753a11d434b49ac1c7706c0cfb9ad45a06cf5f0dedce5c7137c69786d006a"} Mar 19 09:30:29.294064 master-0 kubenswrapper[15202]: I0319 09:30:29.293988 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"ce05bf6050be57f679d8808c21c216584cff22bbd6c73ce590810a791f17b78b"} Mar 19 09:30:29.294064 master-0 kubenswrapper[15202]: I0319 09:30:29.294007 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"aa9ca5f81be4d21b4000c8f0fdd07fdc216eb78f16c7cfa49b4fbe85f9057a8c"} Mar 19 09:30:29.294064 master-0 kubenswrapper[15202]: I0319 09:30:29.294023 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"78f1c2e280836c1221080a867e5d75e5d53fea7242964b76feeee5cd30e104dd"} Mar 19 09:30:29.340625 master-0 kubenswrapper[15202]: I0319 09:30:29.340555 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c75dc494b-tvf5c"] Mar 19 09:30:29.350804 master-0 kubenswrapper[15202]: I0319 09:30:29.350734 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c75dc494b-tvf5c"] Mar 19 09:30:29.685513 master-0 kubenswrapper[15202]: I0319 09:30:29.685387 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 19 09:30:29.835376 master-0 kubenswrapper[15202]: I0319 09:30:29.835284 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56655dac-d4e6-47d8-a143-25d1e27102c2-kube-api-access\") pod \"56655dac-d4e6-47d8-a143-25d1e27102c2\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " Mar 19 09:30:29.835376 master-0 kubenswrapper[15202]: I0319 09:30:29.835385 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-kubelet-dir\") pod \"56655dac-d4e6-47d8-a143-25d1e27102c2\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " Mar 19 09:30:29.835753 master-0 kubenswrapper[15202]: I0319 09:30:29.835530 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-var-lock\") pod \"56655dac-d4e6-47d8-a143-25d1e27102c2\" (UID: \"56655dac-d4e6-47d8-a143-25d1e27102c2\") " Mar 19 09:30:29.835944 master-0 kubenswrapper[15202]: I0319 09:30:29.835898 15202 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "56655dac-d4e6-47d8-a143-25d1e27102c2" (UID: "56655dac-d4e6-47d8-a143-25d1e27102c2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:29.836049 master-0 kubenswrapper[15202]: I0319 09:30:29.836015 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-var-lock" (OuterVolumeSpecName: "var-lock") pod "56655dac-d4e6-47d8-a143-25d1e27102c2" (UID: "56655dac-d4e6-47d8-a143-25d1e27102c2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:29.838967 master-0 kubenswrapper[15202]: I0319 09:30:29.838695 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56655dac-d4e6-47d8-a143-25d1e27102c2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "56655dac-d4e6-47d8-a143-25d1e27102c2" (UID: "56655dac-d4e6-47d8-a143-25d1e27102c2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:29.938733 master-0 kubenswrapper[15202]: I0319 09:30:29.938675 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56655dac-d4e6-47d8-a143-25d1e27102c2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:29.938733 master-0 kubenswrapper[15202]: I0319 09:30:29.938717 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:29.938733 master-0 kubenswrapper[15202]: I0319 09:30:29.938727 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/56655dac-d4e6-47d8-a143-25d1e27102c2-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:30.311638 master-0 kubenswrapper[15202]: I0319 09:30:30.311368 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"104d66d823f67f2d0db81952b3e75346a0594dd7f2e33f5fb4f808501d9d251d"} Mar 19 09:30:30.315984 master-0 kubenswrapper[15202]: I0319 09:30:30.315912 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"56655dac-d4e6-47d8-a143-25d1e27102c2","Type":"ContainerDied","Data":"86618ed9204ff2f9a9c022c4697e952722ccef95392aa3786a84ebe71dbefe40"} Mar 19 09:30:30.315984 master-0 kubenswrapper[15202]: I0319 09:30:30.315974 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86618ed9204ff2f9a9c022c4697e952722ccef95392aa3786a84ebe71dbefe40" Mar 19 09:30:30.315984 master-0 kubenswrapper[15202]: I0319 09:30:30.315942 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 19 09:30:30.343332 master-0 kubenswrapper[15202]: I0319 09:30:30.342262 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.342234475 podStartE2EDuration="2.342234475s" podCreationTimestamp="2026-03-19 09:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:30.337680845 +0000 UTC m=+347.723095661" watchObservedRunningTime="2026-03-19 09:30:30.342234475 +0000 UTC m=+347.727649291"
Mar 19 09:30:30.822040 master-0 kubenswrapper[15202]: I0319 09:30:30.821944 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2134ce-a525-4d94-95ac-bd66b0117834" path="/var/lib/kubelet/pods/aa2134ce-a525-4d94-95ac-bd66b0117834/volumes"
Mar 19 09:30:38.356459 master-0 kubenswrapper[15202]: I0319 09:30:38.356336 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.356459 master-0 kubenswrapper[15202]: I0319 09:30:38.356443 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.356459 master-0 kubenswrapper[15202]: I0319 09:30:38.356504 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.357941 master-0 kubenswrapper[15202]: I0319 09:30:38.356538 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.362160 master-0 kubenswrapper[15202]: I0319 09:30:38.362069 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.363058 master-0 kubenswrapper[15202]: I0319 09:30:38.363018 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.377225 master-0 kubenswrapper[15202]: I0319 09:30:38.377159 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:38.379356 master-0 kubenswrapper[15202]: I0319 09:30:38.379312 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:30:40.266945 master-0 kubenswrapper[15202]: I0319 09:30:40.266843 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7988f8bb7-j9w48" podUID="5ba72c91-3222-4576-b61a-1138e693508c" containerName="console" containerID="cri-o://d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946" gracePeriod=15
Mar 19 09:30:40.815287 master-0 kubenswrapper[15202]: I0319 09:30:40.815247 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7988f8bb7-j9w48_5ba72c91-3222-4576-b61a-1138e693508c/console/0.log"
Mar 19 09:30:40.815660 master-0 kubenswrapper[15202]: I0319 09:30:40.815642 15202 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-7988f8bb7-j9w48"
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]: I0319 09:30:40.884204 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-oauth-serving-cert\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]: I0319 09:30:40.884297 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9flx\" (UniqueName: \"kubernetes.io/projected/5ba72c91-3222-4576-b61a-1138e693508c-kube-api-access-h9flx\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]: I0319 09:30:40.884614 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-oauth-config\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]: I0319 09:30:40.884648 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-trusted-ca-bundle\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]: I0319 09:30:40.884675 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-service-ca\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]:
I0319 09:30:40.884745 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-console-config\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886317 master-0 kubenswrapper[15202]: I0319 09:30:40.884796 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-serving-cert\") pod \"5ba72c91-3222-4576-b61a-1138e693508c\" (UID: \"5ba72c91-3222-4576-b61a-1138e693508c\") "
Mar 19 09:30:40.886923 master-0 kubenswrapper[15202]: I0319 09:30:40.886507 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:30:40.890066 master-0 kubenswrapper[15202]: I0319 09:30:40.889928 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba72c91-3222-4576-b61a-1138e693508c-kube-api-access-h9flx" (OuterVolumeSpecName: "kube-api-access-h9flx") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c"). InnerVolumeSpecName "kube-api-access-h9flx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:30:40.890528 master-0 kubenswrapper[15202]: I0319 09:30:40.890261 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-service-ca" (OuterVolumeSpecName: "service-ca") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c").
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:30:40.890602 master-0 kubenswrapper[15202]: I0319 09:30:40.890548 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-console-config" (OuterVolumeSpecName: "console-config") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:30:40.891670 master-0 kubenswrapper[15202]: I0319 09:30:40.891610 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:30:40.894186 master-0 kubenswrapper[15202]: I0319 09:30:40.894152 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:30:40.895682 master-0 kubenswrapper[15202]: I0319 09:30:40.895616 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5ba72c91-3222-4576-b61a-1138e693508c" (UID: "5ba72c91-3222-4576-b61a-1138e693508c"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:30:40.987411 master-0 kubenswrapper[15202]: I0319 09:30:40.987341 15202 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:40.987411 master-0 kubenswrapper[15202]: I0319 09:30:40.987402 15202 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:40.987411 master-0 kubenswrapper[15202]: I0319 09:30:40.987417 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9flx\" (UniqueName: \"kubernetes.io/projected/5ba72c91-3222-4576-b61a-1138e693508c-kube-api-access-h9flx\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:40.987411 master-0 kubenswrapper[15202]: I0319 09:30:40.987430 15202 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ba72c91-3222-4576-b61a-1138e693508c-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:40.987854 master-0 kubenswrapper[15202]: I0319 09:30:40.987443 15202 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:40.987854 master-0 kubenswrapper[15202]: I0319 09:30:40.987458 15202 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:40.987854 master-0 kubenswrapper[15202]: I0319 09:30:40.987485 15202 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName:
\"kubernetes.io/configmap/5ba72c91-3222-4576-b61a-1138e693508c-console-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:30:41.400503 master-0 kubenswrapper[15202]: I0319 09:30:41.400430 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7988f8bb7-j9w48_5ba72c91-3222-4576-b61a-1138e693508c/console/0.log"
Mar 19 09:30:41.401176 master-0 kubenswrapper[15202]: I0319 09:30:41.400531 15202 generic.go:334] "Generic (PLEG): container finished" podID="5ba72c91-3222-4576-b61a-1138e693508c" containerID="d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946" exitCode=2
Mar 19 09:30:41.401176 master-0 kubenswrapper[15202]: I0319 09:30:41.400581 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7988f8bb7-j9w48" event={"ID":"5ba72c91-3222-4576-b61a-1138e693508c","Type":"ContainerDied","Data":"d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946"}
Mar 19 09:30:41.401176 master-0 kubenswrapper[15202]: I0319 09:30:41.400622 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7988f8bb7-j9w48" event={"ID":"5ba72c91-3222-4576-b61a-1138e693508c","Type":"ContainerDied","Data":"48f0e513fa7049ede31558465687d2f9534f116ef2503e2e44bf6ac258ab8b42"}
Mar 19 09:30:41.401176 master-0 kubenswrapper[15202]: I0319 09:30:41.400647 15202 scope.go:117] "RemoveContainer" containerID="d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946"
Mar 19 09:30:41.401176 master-0 kubenswrapper[15202]: I0319 09:30:41.400800 15202 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-7988f8bb7-j9w48"
Mar 19 09:30:41.423480 master-0 kubenswrapper[15202]: I0319 09:30:41.419853 15202 scope.go:117] "RemoveContainer" containerID="d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946"
Mar 19 09:30:41.426866 master-0 kubenswrapper[15202]: E0319 09:30:41.426814 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946\": container with ID starting with d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946 not found: ID does not exist" containerID="d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946"
Mar 19 09:30:41.426960 master-0 kubenswrapper[15202]: I0319 09:30:41.426857 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946"} err="failed to get container status \"d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946\": rpc error: code = NotFound desc = could not find container \"d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946\": container with ID starting with d71ac90526968172dfbd12c61285cf00a6e1aeab1b4966a8ca0c9460b35ab946 not found: ID does not exist"
Mar 19 09:30:41.455995 master-0 kubenswrapper[15202]: I0319 09:30:41.455914 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7988f8bb7-j9w48"]
Mar 19 09:30:41.462886 master-0 kubenswrapper[15202]: I0319 09:30:41.462828 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7988f8bb7-j9w48"]
Mar 19 09:30:42.709508 master-0 kubenswrapper[15202]: I0319 09:30:42.707450 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 09:30:42.710895 master-0 kubenswrapper[15202]: I0319 09:30:42.710836 15202 kuberuntime_container.go:808]
"Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" containerID="cri-o://2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6" gracePeriod=30
Mar 19 09:30:42.711064 master-0 kubenswrapper[15202]: I0319 09:30:42.710946 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" containerID="cri-o://9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a" gracePeriod=30
Mar 19 09:30:42.711153 master-0 kubenswrapper[15202]: I0319 09:30:42.711002 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" containerID="cri-o://f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2" gracePeriod=30
Mar 19 09:30:42.711347 master-0 kubenswrapper[15202]: I0319 09:30:42.711196 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" containerID="cri-o://bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af" gracePeriod=30
Mar 19 09:30:42.711994 master-0 kubenswrapper[15202]: I0319 09:30:42.711351 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" containerID="cri-o://1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566" gracePeriod=30
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: I0319 09:30:42.712222 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: E0319 09:30:42.712981 15202 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: I0319 09:30:42.713010 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: E0319 09:30:42.713038 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56655dac-d4e6-47d8-a143-25d1e27102c2" containerName="installer"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: I0319 09:30:42.713050 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56655dac-d4e6-47d8-a143-25d1e27102c2" containerName="installer"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: E0319 09:30:42.713074 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: I0319 09:30:42.713086 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: E0319 09:30:42.713118 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: I0319 09:30:42.713130 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: E0319 09:30:42.713153 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: I0319 09:30:42.713164 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 09:30:42.713180 master-0 kubenswrapper[15202]: E0319
09:30:42.713190 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2134ce-a525-4d94-95ac-bd66b0117834" containerName="console"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713202 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2134ce-a525-4d94-95ac-bd66b0117834" containerName="console"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: E0319 09:30:42.713226 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713237 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: E0319 09:30:42.713255 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba72c91-3222-4576-b61a-1138e693508c" containerName="console"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713267 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba72c91-3222-4576-b61a-1138e693508c" containerName="console"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: E0319 09:30:42.713283 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713293 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: E0319 09:30:42.713310 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713321 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 09:30:42.714260 master-0
kubenswrapper[15202]: E0319 09:30:42.713341 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713354 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713744 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="56655dac-d4e6-47d8-a143-25d1e27102c2" containerName="installer"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713792 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713821 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713842 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713859 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713872 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713933 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713954 15202 memory_manager.go:354] "RemoveStaleState removing state"
podUID="5ba72c91-3222-4576-b61a-1138e693508c" containerName="console"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.713975 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.714010 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 09:30:42.714260 master-0 kubenswrapper[15202]: I0319 09:30:42.714035 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2134ce-a525-4d94-95ac-bd66b0117834" containerName="console"
Mar 19 09:30:42.823210 master-0 kubenswrapper[15202]: I0319 09:30:42.823136 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.823210 master-0 kubenswrapper[15202]: I0319 09:30:42.823205 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.823599 master-0 kubenswrapper[15202]: I0319 09:30:42.823261 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.823599 master-0 kubenswrapper[15202]: I0319 09:30:42.823340 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.823599 master-0 kubenswrapper[15202]: I0319 09:30:42.823371 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.823599 master-0 kubenswrapper[15202]: I0319 09:30:42.823391 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.824141 master-0 kubenswrapper[15202]: I0319 09:30:42.824090 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba72c91-3222-4576-b61a-1138e693508c" path="/var/lib/kubelet/pods/5ba72c91-3222-4576-b61a-1138e693508c/volumes"
Mar 19 09:30:42.925524 master-0 kubenswrapper[15202]: I0319 09:30:42.925417 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925524 master-0 kubenswrapper[15202]: I0319 09:30:42.925490 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998
master-0 kubenswrapper[15202]: I0319 09:30:42.925559 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925617 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925758 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925799 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925821 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925843 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName:
\"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925883 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.925998 master-0 kubenswrapper[15202]: I0319 09:30:42.925888 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.926581 master-0 kubenswrapper[15202]: I0319 09:30:42.926031 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:42.926581 master-0 kubenswrapper[15202]: I0319 09:30:42.926115 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 09:30:43.438224 master-0 kubenswrapper[15202]: I0319 09:30:43.437958 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 19 09:30:43.439758 master-0 kubenswrapper[15202]: I0319 09:30:43.439718 15202 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 19 09:30:43.442014 master-0 kubenswrapper[15202]: I0319 09:30:43.441964 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a" exitCode=2
Mar 19 09:30:43.442014 master-0 kubenswrapper[15202]: I0319 09:30:43.441997 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566" exitCode=0
Mar 19 09:30:43.442014 master-0 kubenswrapper[15202]: I0319 09:30:43.442006 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2" exitCode=2
Mar 19 09:30:43.841917 master-0 kubenswrapper[15202]: I0319 09:30:43.841758 15202 scope.go:117] "RemoveContainer" containerID="a7909254e1fd575ef7a679770eb6617922c50b1fbb682ef07075bcdacdc5e021"
Mar 19 09:30:51.759659 master-0 kubenswrapper[15202]: I0319 09:30:51.759518 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:30:57.589231 master-0 kubenswrapper[15202]: I0319 09:30:57.589164 15202 generic.go:334] "Generic (PLEG): container finished" podID="7e9b2506-dac6-4a23-b2bf-e3ce77919857" containerID="8166208a9de3db4478dd98a7e82a28a5f0fe8ddb5b9e4ea9b0c1e5bc58a5e93f" exitCode=0
Mar 19 09:30:57.589854 master-0 kubenswrapper[15202]: I0319 09:30:57.589247 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"7e9b2506-dac6-4a23-b2bf-e3ce77919857","Type":"ContainerDied","Data":"8166208a9de3db4478dd98a7e82a28a5f0fe8ddb5b9e4ea9b0c1e5bc58a5e93f"}
Mar 19 09:30:58.947413 master-0 kubenswrapper[15202]: I0319 09:30:58.947364 15202 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:30:59.144241 master-0 kubenswrapper[15202]: I0319 09:30:59.144082 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kube-api-access\") pod \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " Mar 19 09:30:59.144241 master-0 kubenswrapper[15202]: I0319 09:30:59.144151 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-var-lock\") pod \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " Mar 19 09:30:59.144652 master-0 kubenswrapper[15202]: I0319 09:30:59.144254 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kubelet-dir\") pod \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\" (UID: \"7e9b2506-dac6-4a23-b2bf-e3ce77919857\") " Mar 19 09:30:59.144652 master-0 kubenswrapper[15202]: I0319 09:30:59.144499 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-var-lock" (OuterVolumeSpecName: "var-lock") pod "7e9b2506-dac6-4a23-b2bf-e3ce77919857" (UID: "7e9b2506-dac6-4a23-b2bf-e3ce77919857"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:59.144652 master-0 kubenswrapper[15202]: I0319 09:30:59.144540 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7e9b2506-dac6-4a23-b2bf-e3ce77919857" (UID: "7e9b2506-dac6-4a23-b2bf-e3ce77919857"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:30:59.152156 master-0 kubenswrapper[15202]: I0319 09:30:59.152086 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7e9b2506-dac6-4a23-b2bf-e3ce77919857" (UID: "7e9b2506-dac6-4a23-b2bf-e3ce77919857"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:59.246611 master-0 kubenswrapper[15202]: I0319 09:30:59.246527 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:59.246611 master-0 kubenswrapper[15202]: I0319 09:30:59.246590 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e9b2506-dac6-4a23-b2bf-e3ce77919857-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:59.246611 master-0 kubenswrapper[15202]: I0319 09:30:59.246612 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e9b2506-dac6-4a23-b2bf-e3ce77919857-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:30:59.606825 master-0 kubenswrapper[15202]: I0319 09:30:59.606763 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"7e9b2506-dac6-4a23-b2bf-e3ce77919857","Type":"ContainerDied","Data":"de7c9fc4f4b28fcca9ecb1e733759d6854036889522090fe12a41cd2b0b03956"} Mar 19 09:30:59.606825 master-0 kubenswrapper[15202]: I0319 09:30:59.606812 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de7c9fc4f4b28fcca9ecb1e733759d6854036889522090fe12a41cd2b0b03956" Mar 19 09:30:59.607187 master-0 
kubenswrapper[15202]: I0319 09:30:59.606836 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 09:31:00.574222 master-0 kubenswrapper[15202]: E0319 09:31:00.574136 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:05.295743 master-0 kubenswrapper[15202]: E0319 09:31:05.295657 15202 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:30:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:10.575141 master-0 kubenswrapper[15202]: E0319 09:31:10.575024 15202 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:13.320231 master-0 kubenswrapper[15202]: I0319 09:31:13.320161 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 09:31:13.321720 master-0 kubenswrapper[15202]: I0319 09:31:13.321644 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 09:31:13.322815 master-0 kubenswrapper[15202]: I0319 09:31:13.322763 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 09:31:13.324173 master-0 kubenswrapper[15202]: I0319 09:31:13.324118 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:13.410496 master-0 kubenswrapper[15202]: I0319 09:31:13.410422 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410566 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410614 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir" (OuterVolumeSpecName: 
"data-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410675 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir" (OuterVolumeSpecName: "log-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410647 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410649 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410724 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410743 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410769 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410794 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:31:13.410862 master-0 kubenswrapper[15202]: I0319 09:31:13.410816 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.410907 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.411136 15202 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.411156 15202 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.411165 15202 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.411174 15202 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.411182 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:31:13.411324 master-0 kubenswrapper[15202]: I0319 09:31:13.411190 15202 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:31:13.724890 master-0 kubenswrapper[15202]: I0319 09:31:13.724848 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 09:31:13.726389 master-0 kubenswrapper[15202]: I0319 
09:31:13.726354 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 09:31:13.727454 master-0 kubenswrapper[15202]: I0319 09:31:13.727439 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 09:31:13.729158 master-0 kubenswrapper[15202]: I0319 09:31:13.729067 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af" exitCode=0 Mar 19 09:31:13.729158 master-0 kubenswrapper[15202]: I0319 09:31:13.729124 15202 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6" exitCode=137 Mar 19 09:31:13.729276 master-0 kubenswrapper[15202]: I0319 09:31:13.729194 15202 scope.go:117] "RemoveContainer" containerID="9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a" Mar 19 09:31:13.729276 master-0 kubenswrapper[15202]: I0319 09:31:13.729233 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:13.751602 master-0 kubenswrapper[15202]: I0319 09:31:13.751547 15202 scope.go:117] "RemoveContainer" containerID="1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566" Mar 19 09:31:13.780754 master-0 kubenswrapper[15202]: I0319 09:31:13.780707 15202 scope.go:117] "RemoveContainer" containerID="f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2" Mar 19 09:31:13.795963 master-0 kubenswrapper[15202]: I0319 09:31:13.795920 15202 scope.go:117] "RemoveContainer" containerID="bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af" Mar 19 09:31:13.813074 master-0 kubenswrapper[15202]: I0319 09:31:13.813010 15202 scope.go:117] "RemoveContainer" containerID="2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6" Mar 19 09:31:13.826547 master-0 kubenswrapper[15202]: I0319 09:31:13.826508 15202 scope.go:117] "RemoveContainer" containerID="45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9" Mar 19 09:31:13.879306 master-0 kubenswrapper[15202]: I0319 09:31:13.879253 15202 scope.go:117] "RemoveContainer" containerID="630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f" Mar 19 09:31:13.899165 master-0 kubenswrapper[15202]: I0319 09:31:13.895723 15202 scope.go:117] "RemoveContainer" containerID="4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391" Mar 19 09:31:13.918157 master-0 kubenswrapper[15202]: I0319 09:31:13.918098 15202 scope.go:117] "RemoveContainer" containerID="9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a" Mar 19 09:31:13.919717 master-0 kubenswrapper[15202]: E0319 09:31:13.919675 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a\": container with ID starting with 9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a not found: ID does not 
exist" containerID="9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a" Mar 19 09:31:13.919887 master-0 kubenswrapper[15202]: I0319 09:31:13.919717 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a"} err="failed to get container status \"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a\": rpc error: code = NotFound desc = could not find container \"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a\": container with ID starting with 9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a not found: ID does not exist" Mar 19 09:31:13.919887 master-0 kubenswrapper[15202]: I0319 09:31:13.919748 15202 scope.go:117] "RemoveContainer" containerID="1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566" Mar 19 09:31:13.920424 master-0 kubenswrapper[15202]: E0319 09:31:13.920375 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566\": container with ID starting with 1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566 not found: ID does not exist" containerID="1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566" Mar 19 09:31:13.920508 master-0 kubenswrapper[15202]: I0319 09:31:13.920432 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566"} err="failed to get container status \"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566\": rpc error: code = NotFound desc = could not find container \"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566\": container with ID starting with 1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566 not found: ID does not exist" Mar 19 09:31:13.920508 
master-0 kubenswrapper[15202]: I0319 09:31:13.920464 15202 scope.go:117] "RemoveContainer" containerID="f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2" Mar 19 09:31:13.921052 master-0 kubenswrapper[15202]: E0319 09:31:13.921024 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2\": container with ID starting with f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2 not found: ID does not exist" containerID="f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2" Mar 19 09:31:13.921100 master-0 kubenswrapper[15202]: I0319 09:31:13.921058 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2"} err="failed to get container status \"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2\": rpc error: code = NotFound desc = could not find container \"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2\": container with ID starting with f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2 not found: ID does not exist" Mar 19 09:31:13.921100 master-0 kubenswrapper[15202]: I0319 09:31:13.921079 15202 scope.go:117] "RemoveContainer" containerID="bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af" Mar 19 09:31:13.921448 master-0 kubenswrapper[15202]: E0319 09:31:13.921418 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af\": container with ID starting with bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af not found: ID does not exist" containerID="bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af" Mar 19 09:31:13.921528 master-0 kubenswrapper[15202]: I0319 
09:31:13.921453 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af"} err="failed to get container status \"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af\": rpc error: code = NotFound desc = could not find container \"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af\": container with ID starting with bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af not found: ID does not exist" Mar 19 09:31:13.921528 master-0 kubenswrapper[15202]: I0319 09:31:13.921484 15202 scope.go:117] "RemoveContainer" containerID="2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6" Mar 19 09:31:13.921935 master-0 kubenswrapper[15202]: E0319 09:31:13.921895 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6\": container with ID starting with 2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6 not found: ID does not exist" containerID="2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6" Mar 19 09:31:13.921999 master-0 kubenswrapper[15202]: I0319 09:31:13.921943 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6"} err="failed to get container status \"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6\": rpc error: code = NotFound desc = could not find container \"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6\": container with ID starting with 2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6 not found: ID does not exist" Mar 19 09:31:13.921999 master-0 kubenswrapper[15202]: I0319 09:31:13.921978 15202 scope.go:117] "RemoveContainer" 
containerID="45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9" Mar 19 09:31:13.922689 master-0 kubenswrapper[15202]: E0319 09:31:13.922657 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9\": container with ID starting with 45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9 not found: ID does not exist" containerID="45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9" Mar 19 09:31:13.922756 master-0 kubenswrapper[15202]: I0319 09:31:13.922691 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9"} err="failed to get container status \"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9\": rpc error: code = NotFound desc = could not find container \"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9\": container with ID starting with 45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9 not found: ID does not exist" Mar 19 09:31:13.922756 master-0 kubenswrapper[15202]: I0319 09:31:13.922714 15202 scope.go:117] "RemoveContainer" containerID="630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f" Mar 19 09:31:13.923100 master-0 kubenswrapper[15202]: E0319 09:31:13.923065 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f\": container with ID starting with 630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f not found: ID does not exist" containerID="630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f" Mar 19 09:31:13.923151 master-0 kubenswrapper[15202]: I0319 09:31:13.923098 15202 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f"} err="failed to get container status \"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f\": rpc error: code = NotFound desc = could not find container \"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f\": container with ID starting with 630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f not found: ID does not exist" Mar 19 09:31:13.923151 master-0 kubenswrapper[15202]: I0319 09:31:13.923117 15202 scope.go:117] "RemoveContainer" containerID="4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391" Mar 19 09:31:13.923756 master-0 kubenswrapper[15202]: E0319 09:31:13.923696 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391\": container with ID starting with 4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391 not found: ID does not exist" containerID="4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391" Mar 19 09:31:13.923807 master-0 kubenswrapper[15202]: I0319 09:31:13.923761 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391"} err="failed to get container status \"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391\": rpc error: code = NotFound desc = could not find container \"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391\": container with ID starting with 4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391 not found: ID does not exist" Mar 19 09:31:13.923807 master-0 kubenswrapper[15202]: I0319 09:31:13.923789 15202 scope.go:117] "RemoveContainer" containerID="9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a" Mar 19 09:31:13.924128 master-0 kubenswrapper[15202]: I0319 
09:31:13.924093 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a"} err="failed to get container status \"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a\": rpc error: code = NotFound desc = could not find container \"9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a\": container with ID starting with 9286e8133f6ba77ee702ab4f138ac839f7a5c86d58ddbabe6f88132b0c10ba1a not found: ID does not exist" Mar 19 09:31:13.924128 master-0 kubenswrapper[15202]: I0319 09:31:13.924120 15202 scope.go:117] "RemoveContainer" containerID="1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566" Mar 19 09:31:13.924385 master-0 kubenswrapper[15202]: I0319 09:31:13.924346 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566"} err="failed to get container status \"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566\": rpc error: code = NotFound desc = could not find container \"1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566\": container with ID starting with 1f83cbdd4826a43f51dafa812d978653f16293769849982ddc10062602a22566 not found: ID does not exist" Mar 19 09:31:13.924385 master-0 kubenswrapper[15202]: I0319 09:31:13.924376 15202 scope.go:117] "RemoveContainer" containerID="f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2" Mar 19 09:31:13.924950 master-0 kubenswrapper[15202]: I0319 09:31:13.924908 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2"} err="failed to get container status \"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2\": rpc error: code = NotFound desc = could not find container 
\"f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2\": container with ID starting with f33a3ba977a9bd0af2c09d4851f4c722c5ca130ddc24bd743ee2167f622258e2 not found: ID does not exist" Mar 19 09:31:13.924950 master-0 kubenswrapper[15202]: I0319 09:31:13.924943 15202 scope.go:117] "RemoveContainer" containerID="bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af" Mar 19 09:31:13.925279 master-0 kubenswrapper[15202]: I0319 09:31:13.925247 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af"} err="failed to get container status \"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af\": rpc error: code = NotFound desc = could not find container \"bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af\": container with ID starting with bbf349cb2e0fcbfbde74aad4c53f7cbbc5279fd4c2b5871453bcbdda7a06a9af not found: ID does not exist" Mar 19 09:31:13.925279 master-0 kubenswrapper[15202]: I0319 09:31:13.925271 15202 scope.go:117] "RemoveContainer" containerID="2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6" Mar 19 09:31:13.925668 master-0 kubenswrapper[15202]: I0319 09:31:13.925640 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6"} err="failed to get container status \"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6\": rpc error: code = NotFound desc = could not find container \"2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6\": container with ID starting with 2bde28772d427727019c9255ebfe58ec028428554f225c40fce659c7d10111f6 not found: ID does not exist" Mar 19 09:31:13.925730 master-0 kubenswrapper[15202]: I0319 09:31:13.925669 15202 scope.go:117] "RemoveContainer" containerID="45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9" Mar 19 
09:31:13.926031 master-0 kubenswrapper[15202]: I0319 09:31:13.926001 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9"} err="failed to get container status \"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9\": rpc error: code = NotFound desc = could not find container \"45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9\": container with ID starting with 45860d9f89e763cf0b028c718d3a5fcda2f135f666f1b23a1a345cfeaf8139e9 not found: ID does not exist" Mar 19 09:31:13.926087 master-0 kubenswrapper[15202]: I0319 09:31:13.926028 15202 scope.go:117] "RemoveContainer" containerID="630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f" Mar 19 09:31:13.926520 master-0 kubenswrapper[15202]: I0319 09:31:13.926494 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f"} err="failed to get container status \"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f\": rpc error: code = NotFound desc = could not find container \"630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f\": container with ID starting with 630efeb085db0e9f34a80beaace05561b0c40984980e54241832dedddcb71f9f not found: ID does not exist" Mar 19 09:31:13.926583 master-0 kubenswrapper[15202]: I0319 09:31:13.926518 15202 scope.go:117] "RemoveContainer" containerID="4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391" Mar 19 09:31:13.926969 master-0 kubenswrapper[15202]: I0319 09:31:13.926945 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391"} err="failed to get container status \"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391\": rpc error: code = NotFound desc = could not find 
container \"4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391\": container with ID starting with 4a61e313fe61b71fbe9ecdb72eeb3947f0b12cab39e453fe17b96b5b277b7391 not found: ID does not exist" Mar 19 09:31:14.826404 master-0 kubenswrapper[15202]: I0319 09:31:14.826315 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b4ed170d527099878cb5fdd508a2fb" path="/var/lib/kubelet/pods/24b4ed170d527099878cb5fdd508a2fb/volumes" Mar 19 09:31:15.296886 master-0 kubenswrapper[15202]: E0319 09:31:15.296818 15202 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:16.737406 master-0 kubenswrapper[15202]: E0319 09:31:16.737251 15202 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3426635b7669 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:30:42.710894185 +0000 UTC m=+360.096309041,LastTimestamp:2026-03-19 09:30:42.710894185 +0000 UTC m=+360.096309041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:31:19.811504 master-0 kubenswrapper[15202]: I0319 09:31:19.811387 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:19.834034 master-0 kubenswrapper[15202]: I0319 09:31:19.833949 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:31:19.834034 master-0 kubenswrapper[15202]: I0319 09:31:19.834032 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:31:20.575566 master-0 kubenswrapper[15202]: E0319 09:31:20.575490 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 19 09:31:25.304542 master-0 kubenswrapper[15202]: E0319 09:31:25.298143 15202 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:30.576737 master-0 kubenswrapper[15202]: E0319 09:31:30.576621 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:33.904075 master-0 kubenswrapper[15202]: I0319 09:31:33.903992 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kqb2h_b2898746-6827-41d9-ac88-64206cb84ac9/approver/1.log" Mar 19 09:31:33.905040 master-0 kubenswrapper[15202]: I0319 09:31:33.904994 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kqb2h_b2898746-6827-41d9-ac88-64206cb84ac9/approver/0.log" Mar 19 09:31:33.905582 master-0 
kubenswrapper[15202]: I0319 09:31:33.905521 15202 generic.go:334] "Generic (PLEG): container finished" podID="b2898746-6827-41d9-ac88-64206cb84ac9" containerID="93d35b0e89d31207bfcd7222380f4dd439dd98f18c6725c9186b6ad660b61c77" exitCode=1 Mar 19 09:31:33.905647 master-0 kubenswrapper[15202]: I0319 09:31:33.905595 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerDied","Data":"93d35b0e89d31207bfcd7222380f4dd439dd98f18c6725c9186b6ad660b61c77"} Mar 19 09:31:33.905697 master-0 kubenswrapper[15202]: I0319 09:31:33.905657 15202 scope.go:117] "RemoveContainer" containerID="5f66b7b4498be8ffcef1be07d5415ae49ca99cf0c15b74518d97c2537613d5cc" Mar 19 09:31:33.906820 master-0 kubenswrapper[15202]: I0319 09:31:33.906767 15202 scope.go:117] "RemoveContainer" containerID="93d35b0e89d31207bfcd7222380f4dd439dd98f18c6725c9186b6ad660b61c77" Mar 19 09:31:34.920372 master-0 kubenswrapper[15202]: I0319 09:31:34.920281 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kqb2h_b2898746-6827-41d9-ac88-64206cb84ac9/approver/1.log" Mar 19 09:31:34.920974 master-0 kubenswrapper[15202]: I0319 09:31:34.920937 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kqb2h" event={"ID":"b2898746-6827-41d9-ac88-64206cb84ac9","Type":"ContainerStarted","Data":"0659d8110bdccd9562c163059a64982a1188fb13f3e7d1207907c387c87a4f32"} Mar 19 09:31:35.299189 master-0 kubenswrapper[15202]: E0319 09:31:35.299111 15202 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:40.578364 master-0 kubenswrapper[15202]: E0319 09:31:40.578116 
15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:40.578364 master-0 kubenswrapper[15202]: I0319 09:31:40.578246 15202 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:31:42.817110 master-0 kubenswrapper[15202]: I0319 09:31:42.817034 15202 status_manager.go:851] "Failed to get status for pod" podUID="24b4ed170d527099878cb5fdd508a2fb" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 19 09:31:45.300172 master-0 kubenswrapper[15202]: E0319 09:31:45.300081 15202 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:31:45.300172 master-0 kubenswrapper[15202]: E0319 09:31:45.300132 15202 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:31:50.578659 master-0 kubenswrapper[15202]: E0319 09:31:50.578565 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 19 09:31:50.740888 master-0 kubenswrapper[15202]: E0319 09:31:50.740683 15202 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{etcd-master-0.189e3426635c490a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:30:42.710948106 +0000 UTC m=+360.096362942,LastTimestamp:2026-03-19 09:30:42.710948106 +0000 UTC m=+360.096362942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:31:53.837728 master-0 kubenswrapper[15202]: E0319 09:31:53.837675 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:53.838837 master-0 kubenswrapper[15202]: I0319 09:31:53.838822 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 09:31:53.873118 master-0 kubenswrapper[15202]: W0319 09:31:53.873016 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094204df314fe45bd5af12ca1b4622bb.slice/crio-d88b9c3dac814ade57ae0d5ba04620d30034bff657dac8de17c607feabfb5f68 WatchSource:0}: Error finding container d88b9c3dac814ade57ae0d5ba04620d30034bff657dac8de17c607feabfb5f68: Status 404 returned error can't find the container with id d88b9c3dac814ade57ae0d5ba04620d30034bff657dac8de17c607feabfb5f68 Mar 19 09:31:54.083638 master-0 kubenswrapper[15202]: I0319 09:31:54.083605 15202 generic.go:334] "Generic (PLEG): container finished" podID="0cb70a30-a8d1-4037-81e6-eb4f0510a234" containerID="436264327abe3325ff4b8c101407c4a1d8b93ad5d90afa55d96f0c001990b3fe" exitCode=0 Mar 19 09:31:54.083856 master-0 kubenswrapper[15202]: I0319 09:31:54.083778 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerDied","Data":"436264327abe3325ff4b8c101407c4a1d8b93ad5d90afa55d96f0c001990b3fe"} Mar 19 09:31:54.084493 master-0 kubenswrapper[15202]: I0319 09:31:54.084459 15202 scope.go:117] "RemoveContainer" containerID="177c182a5bea3b40e5ef865a8fb1c4ca35bdcfe0ace11f68cfd864cda9f6ac36" Mar 19 09:31:54.085050 master-0 kubenswrapper[15202]: I0319 09:31:54.085028 15202 scope.go:117] "RemoveContainer" containerID="436264327abe3325ff4b8c101407c4a1d8b93ad5d90afa55d96f0c001990b3fe" Mar 19 09:31:54.085302 master-0 kubenswrapper[15202]: E0319 09:31:54.085281 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-68bf6ff9d6-wshz8_openshift-insights(0cb70a30-a8d1-4037-81e6-eb4f0510a234)\"" 
pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" podUID="0cb70a30-a8d1-4037-81e6-eb4f0510a234" Mar 19 09:31:54.085480 master-0 kubenswrapper[15202]: I0319 09:31:54.085425 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"d88b9c3dac814ade57ae0d5ba04620d30034bff657dac8de17c607feabfb5f68"} Mar 19 09:31:55.095839 master-0 kubenswrapper[15202]: I0319 09:31:55.095743 15202 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="d54182c1cde3210ea5d4203fa3ed52cd5a8e9887997ffc53053ac2db77d730d9" exitCode=0 Mar 19 09:31:55.096443 master-0 kubenswrapper[15202]: I0319 09:31:55.095863 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"d54182c1cde3210ea5d4203fa3ed52cd5a8e9887997ffc53053ac2db77d730d9"} Mar 19 09:31:55.096443 master-0 kubenswrapper[15202]: I0319 09:31:55.096340 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:31:55.096443 master-0 kubenswrapper[15202]: I0319 09:31:55.096400 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:32:00.781086 master-0 kubenswrapper[15202]: E0319 09:32:00.780977 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 19 09:32:05.812316 master-0 kubenswrapper[15202]: I0319 09:32:05.812237 15202 scope.go:117] "RemoveContainer" containerID="436264327abe3325ff4b8c101407c4a1d8b93ad5d90afa55d96f0c001990b3fe" Mar 19 
09:32:05.815916 master-0 kubenswrapper[15202]: E0319 09:32:05.812601 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-68bf6ff9d6-wshz8_openshift-insights(0cb70a30-a8d1-4037-81e6-eb4f0510a234)\"" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" podUID="0cb70a30-a8d1-4037-81e6-eb4f0510a234" Mar 19 09:32:11.182619 master-0 kubenswrapper[15202]: E0319 09:32:11.182450 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="800ms" Mar 19 09:32:17.813030 master-0 kubenswrapper[15202]: I0319 09:32:17.812935 15202 scope.go:117] "RemoveContainer" containerID="436264327abe3325ff4b8c101407c4a1d8b93ad5d90afa55d96f0c001990b3fe" Mar 19 09:32:18.307232 master-0 kubenswrapper[15202]: I0319 09:32:18.307142 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-wshz8" event={"ID":"0cb70a30-a8d1-4037-81e6-eb4f0510a234","Type":"ContainerStarted","Data":"65900e4d79053304224f234bb6784fe64baeece698a3351e483fe85a99a80c7a"} Mar 19 09:32:21.984514 master-0 kubenswrapper[15202]: E0319 09:32:21.984360 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 19 09:32:24.744120 master-0 kubenswrapper[15202]: E0319 09:32:24.743875 15202 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3426635edfb6 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:30:42.71111775 +0000 UTC m=+360.096532576,LastTimestamp:2026-03-19 09:30:42.71111775 +0000 UTC m=+360.096532576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:32:29.100736 master-0 kubenswrapper[15202]: E0319 09:32:29.100648 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:32:29.425933 master-0 kubenswrapper[15202]: I0319 09:32:29.425873 15202 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="af372d02d065756a47a3731ef3940277aa5a524049789a1f2f34edf0d3774047" exitCode=0 Mar 19 09:32:29.425933 master-0 kubenswrapper[15202]: I0319 09:32:29.425937 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"af372d02d065756a47a3731ef3940277aa5a524049789a1f2f34edf0d3774047"} Mar 19 09:32:29.426257 master-0 kubenswrapper[15202]: I0319 09:32:29.426219 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:32:29.426257 master-0 kubenswrapper[15202]: I0319 09:32:29.426250 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:32:33.586249 master-0 kubenswrapper[15202]: E0319 09:32:33.586094 15202 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 19 09:32:37.503201 master-0 kubenswrapper[15202]: I0319 09:32:37.503112 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/0.log" Mar 19 09:32:37.503201 master-0 kubenswrapper[15202]: I0319 09:32:37.503181 15202 generic.go:334] "Generic (PLEG): container finished" podID="e3376275-294d-446d-9b4c-930df60dba01" containerID="7d09aca9fefb402af8b2ae5b0086c54b39e7c40d8e4c2624e1555fd0e0a43d99" exitCode=1 Mar 19 09:32:37.504270 master-0 kubenswrapper[15202]: I0319 09:32:37.503244 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerDied","Data":"7d09aca9fefb402af8b2ae5b0086c54b39e7c40d8e4c2624e1555fd0e0a43d99"} Mar 19 09:32:37.504270 master-0 kubenswrapper[15202]: I0319 09:32:37.503876 15202 scope.go:117] "RemoveContainer" containerID="7d09aca9fefb402af8b2ae5b0086c54b39e7c40d8e4c2624e1555fd0e0a43d99" Mar 19 09:32:37.507540 master-0 kubenswrapper[15202]: I0319 09:32:37.507440 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-7wdws_1dd59466-0133-41fe-a648-28db73aa861b/manager/0.log" Mar 19 09:32:37.508525 master-0 kubenswrapper[15202]: I0319 09:32:37.508424 15202 generic.go:334] "Generic (PLEG): container finished" podID="1dd59466-0133-41fe-a648-28db73aa861b" containerID="a1f85bd022ed8d1a8a116afb5f7497547553a16a5ec3238e8ae6d26b7095a795" exitCode=1 Mar 19 09:32:37.508739 master-0 kubenswrapper[15202]: I0319 09:32:37.508494 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" event={"ID":"1dd59466-0133-41fe-a648-28db73aa861b","Type":"ContainerDied","Data":"a1f85bd022ed8d1a8a116afb5f7497547553a16a5ec3238e8ae6d26b7095a795"} Mar 19 09:32:37.510361 master-0 kubenswrapper[15202]: I0319 09:32:37.510327 15202 scope.go:117] "RemoveContainer" containerID="a1f85bd022ed8d1a8a116afb5f7497547553a16a5ec3238e8ae6d26b7095a795" Mar 19 09:32:38.522692 master-0 kubenswrapper[15202]: I0319 09:32:38.522581 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-7wdws_1dd59466-0133-41fe-a648-28db73aa861b/manager/0.log" Mar 19 09:32:38.523679 master-0 kubenswrapper[15202]: I0319 09:32:38.523412 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" event={"ID":"1dd59466-0133-41fe-a648-28db73aa861b","Type":"ContainerStarted","Data":"1efac4c54706ac3f965252439b3f8492ae66dd16fa96c13c13a34f5f18e8bd25"} Mar 19 09:32:38.524087 master-0 kubenswrapper[15202]: I0319 09:32:38.524003 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:32:38.528384 master-0 kubenswrapper[15202]: I0319 09:32:38.528314 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/0.log" Mar 19 09:32:38.528885 master-0 kubenswrapper[15202]: I0319 09:32:38.528425 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerStarted","Data":"995dee542cc051ca1c47aee3b5093e7beb1a53442ceb14fd42c8ed8a57e129e2"} Mar 19 09:32:41.557902 master-0 kubenswrapper[15202]: I0319 
09:32:41.557745 15202 generic.go:334] "Generic (PLEG): container finished" podID="33e92e5d-61ea-45b2-b357-ebffdaebf4af" containerID="bcdb0cf22b96fe48eebdc24abb2d5b2914b32473e5a78be9ded7d96d4faa029e" exitCode=0 Mar 19 09:32:41.557902 master-0 kubenswrapper[15202]: I0319 09:32:41.557808 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" event={"ID":"33e92e5d-61ea-45b2-b357-ebffdaebf4af","Type":"ContainerDied","Data":"bcdb0cf22b96fe48eebdc24abb2d5b2914b32473e5a78be9ded7d96d4faa029e"} Mar 19 09:32:41.557902 master-0 kubenswrapper[15202]: I0319 09:32:41.557898 15202 scope.go:117] "RemoveContainer" containerID="e567b2a6970dbbdd6d360830a8ee46fec46945b28639df21bdc4828de4e3065b" Mar 19 09:32:41.559141 master-0 kubenswrapper[15202]: I0319 09:32:41.558728 15202 scope.go:117] "RemoveContainer" containerID="bcdb0cf22b96fe48eebdc24abb2d5b2914b32473e5a78be9ded7d96d4faa029e" Mar 19 09:32:42.571016 master-0 kubenswrapper[15202]: I0319 09:32:42.570922 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" event={"ID":"33e92e5d-61ea-45b2-b357-ebffdaebf4af","Type":"ContainerStarted","Data":"280b7be5da9405e8b89c23726abdbe688ae8df5688ff2d76c314fbf083ae450d"} Mar 19 09:32:42.571659 master-0 kubenswrapper[15202]: I0319 09:32:42.571572 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:32:42.576737 master-0 kubenswrapper[15202]: I0319 09:32:42.576689 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-6qck2" Mar 19 09:32:42.825681 master-0 kubenswrapper[15202]: I0319 09:32:42.825382 15202 status_manager.go:851] "Failed to get status for pod" podUID="7e9b2506-dac6-4a23-b2bf-e3ce77919857" pod="openshift-etcd/installer-2-master-0" err="the server was unable to return a response in the 
time allotted, but may still be processing the request (get pods installer-2-master-0)" Mar 19 09:32:45.597590 master-0 kubenswrapper[15202]: I0319 09:32:45.597527 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-pn5gg_db42b38e-294e-4016-8ac1-54126ac60de8/manager/1.log" Mar 19 09:32:45.599416 master-0 kubenswrapper[15202]: I0319 09:32:45.599367 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-pn5gg_db42b38e-294e-4016-8ac1-54126ac60de8/manager/0.log" Mar 19 09:32:45.599518 master-0 kubenswrapper[15202]: I0319 09:32:45.599448 15202 generic.go:334] "Generic (PLEG): container finished" podID="db42b38e-294e-4016-8ac1-54126ac60de8" containerID="8dc1d90f2bef6de1fc7cdb208514e8d0665da95d86860b74cd808de5a4cefbd2" exitCode=1 Mar 19 09:32:45.599557 master-0 kubenswrapper[15202]: I0319 09:32:45.599512 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerDied","Data":"8dc1d90f2bef6de1fc7cdb208514e8d0665da95d86860b74cd808de5a4cefbd2"} Mar 19 09:32:45.599591 master-0 kubenswrapper[15202]: I0319 09:32:45.599575 15202 scope.go:117] "RemoveContainer" containerID="35548679df169ca8289b897c2b3d4fef8fe6d512fd7ac178d0e99404cb991d50" Mar 19 09:32:45.600408 master-0 kubenswrapper[15202]: I0319 09:32:45.600354 15202 scope.go:117] "RemoveContainer" containerID="8dc1d90f2bef6de1fc7cdb208514e8d0665da95d86860b74cd808de5a4cefbd2" Mar 19 09:32:46.121863 master-0 kubenswrapper[15202]: I0319 09:32:46.119912 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:32:46.121863 master-0 kubenswrapper[15202]: I0319 09:32:46.120690 15202 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:32:46.126875 master-0 kubenswrapper[15202]: I0319 09:32:46.125870 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-7wdws" Mar 19 09:32:46.616442 master-0 kubenswrapper[15202]: I0319 09:32:46.616357 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-pn5gg_db42b38e-294e-4016-8ac1-54126ac60de8/manager/1.log" Mar 19 09:32:46.617573 master-0 kubenswrapper[15202]: I0319 09:32:46.617068 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" event={"ID":"db42b38e-294e-4016-8ac1-54126ac60de8","Type":"ContainerStarted","Data":"4ae6c82ba4634c343e3732d86dbc209a377ae77ed2c8e87d7159c2f437360299"} Mar 19 09:32:46.617573 master-0 kubenswrapper[15202]: I0319 09:32:46.617514 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg" Mar 19 09:32:46.787623 master-0 kubenswrapper[15202]: E0319 09:32:46.787490 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="6.4s" Mar 19 09:32:54.690878 master-0 kubenswrapper[15202]: I0319 09:32:54.690811 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/config-sync-controllers/0.log" Mar 19 09:32:54.692286 master-0 kubenswrapper[15202]: I0319 09:32:54.692241 15202 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/cluster-cloud-controller-manager/0.log" Mar 19 09:32:54.692355 master-0 kubenswrapper[15202]: I0319 09:32:54.692316 15202 generic.go:334] "Generic (PLEG): container finished" podID="a4149b83-964c-4bd2-9769-44c7b9da0a52" containerID="735721424ad64f75cfc79f2a38adb31c107586c1ff18c9e0dc56b5f0173c3489" exitCode=1 Mar 19 09:32:54.692355 master-0 kubenswrapper[15202]: I0319 09:32:54.692340 15202 generic.go:334] "Generic (PLEG): container finished" podID="a4149b83-964c-4bd2-9769-44c7b9da0a52" containerID="203dd22278a9f40fcddde6dd79a4ed7c4144f2d1f0a477ecc5c55b58667906c6" exitCode=1 Mar 19 09:32:54.692479 master-0 kubenswrapper[15202]: I0319 09:32:54.692366 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerDied","Data":"735721424ad64f75cfc79f2a38adb31c107586c1ff18c9e0dc56b5f0173c3489"} Mar 19 09:32:54.692479 master-0 kubenswrapper[15202]: I0319 09:32:54.692439 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerDied","Data":"203dd22278a9f40fcddde6dd79a4ed7c4144f2d1f0a477ecc5c55b58667906c6"} Mar 19 09:32:54.693267 master-0 kubenswrapper[15202]: I0319 09:32:54.693233 15202 scope.go:117] "RemoveContainer" containerID="203dd22278a9f40fcddde6dd79a4ed7c4144f2d1f0a477ecc5c55b58667906c6" Mar 19 09:32:54.693267 master-0 kubenswrapper[15202]: I0319 09:32:54.693264 15202 scope.go:117] "RemoveContainer" containerID="735721424ad64f75cfc79f2a38adb31c107586c1ff18c9e0dc56b5f0173c3489" Mar 19 09:32:55.706257 master-0 kubenswrapper[15202]: 
I0319 09:32:55.706182 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/config-sync-controllers/0.log"
Mar 19 09:32:55.707586 master-0 kubenswrapper[15202]: I0319 09:32:55.707437 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/cluster-cloud-controller-manager/0.log"
Mar 19 09:32:55.707727 master-0 kubenswrapper[15202]: I0319 09:32:55.707659 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerStarted","Data":"d89369f7efdd9e9ba1bdfd5b3284573ee669c1caae5fdab46df7c2e1366e44ec"}
Mar 19 09:32:55.707779 master-0 kubenswrapper[15202]: I0319 09:32:55.707758 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-rz5nt" event={"ID":"a4149b83-964c-4bd2-9769-44c7b9da0a52","Type":"ContainerStarted","Data":"f2d1f59014c375b545b5a89623e75064b1e0f7c1ce6953004bbbe545dd7971fb"}
Mar 19 09:32:56.118536 master-0 kubenswrapper[15202]: I0319 09:32:56.118321 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-pn5gg"
Mar 19 09:32:58.748293 master-0 kubenswrapper[15202]: E0319 09:32:58.747988 15202 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e342663618ac7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:30:42.711292615 +0000 UTC m=+360.096707441,LastTimestamp:2026-03-19 09:30:42.711292615 +0000 UTC m=+360.096707441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:33:03.189315 master-0 kubenswrapper[15202]: E0319 09:33:03.189083 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 09:33:03.430060 master-0 kubenswrapper[15202]: E0319 09:33:03.429841 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 09:33:03.781014 master-0 kubenswrapper[15202]: I0319 09:33:03.780954 15202 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="9402f2122798e11586314f139364e1506c67502fad7d7e31d74d016e18cc57ee" exitCode=0
Mar 19 09:33:03.781374 master-0 kubenswrapper[15202]: I0319 09:33:03.781322 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"9402f2122798e11586314f139364e1506c67502fad7d7e31d74d016e18cc57ee"}
Mar 19 09:33:03.781500 master-0 kubenswrapper[15202]: I0319 09:33:03.781432 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61"
Mar 19 09:33:03.781500 master-0 kubenswrapper[15202]: I0319 09:33:03.781501 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61"
Mar 19 09:33:04.792232 master-0 kubenswrapper[15202]: I0319 09:33:04.792146 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-cscz5_dea35f60-33be-4ccc-b985-952eac3a85c0/machine-approver-controller/0.log"
Mar 19 09:33:04.793256 master-0 kubenswrapper[15202]: I0319 09:33:04.792814 15202 generic.go:334] "Generic (PLEG): container finished" podID="dea35f60-33be-4ccc-b985-952eac3a85c0" containerID="41fff1387ec3d61c33f2a14d88308784724d79567cff9004f6cc0ae8d5850e73" exitCode=255
Mar 19 09:33:04.793256 master-0 kubenswrapper[15202]: I0319 09:33:04.792864 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" event={"ID":"dea35f60-33be-4ccc-b985-952eac3a85c0","Type":"ContainerDied","Data":"41fff1387ec3d61c33f2a14d88308784724d79567cff9004f6cc0ae8d5850e73"}
Mar 19 09:33:04.793743 master-0 kubenswrapper[15202]: I0319 09:33:04.793717 15202 scope.go:117] "RemoveContainer" containerID="41fff1387ec3d61c33f2a14d88308784724d79567cff9004f6cc0ae8d5850e73"
Mar 19 09:33:05.802894 master-0 kubenswrapper[15202]: I0319 09:33:05.802797 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-cscz5_dea35f60-33be-4ccc-b985-952eac3a85c0/machine-approver-controller/0.log"
Mar 19 09:33:05.803747 master-0 kubenswrapper[15202]: I0319 09:33:05.803702 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-cscz5" event={"ID":"dea35f60-33be-4ccc-b985-952eac3a85c0","Type":"ContainerStarted","Data":"010740cf6a723b97371a45b217d2e83508be75ef7944f942d04ed86fd42faee3"}
Mar 19 09:33:06.816722 master-0 kubenswrapper[15202]: I0319 09:33:06.816654 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/3.log"
Mar 19 09:33:06.817761 master-0 kubenswrapper[15202]: I0319 09:33:06.817718 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/2.log"
Mar 19 09:33:06.818542 master-0 kubenswrapper[15202]: I0319 09:33:06.818486 15202 generic.go:334] "Generic (PLEG): container finished" podID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" containerID="f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01" exitCode=1
Mar 19 09:33:06.822429 master-0 kubenswrapper[15202]: I0319 09:33:06.822367 15202 generic.go:334] "Generic (PLEG): container finished" podID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerID="17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc" exitCode=0
Mar 19 09:33:06.824340 master-0 kubenswrapper[15202]: I0319 09:33:06.824240 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerDied","Data":"f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01"}
Mar 19 09:33:06.824440 master-0 kubenswrapper[15202]: I0319 09:33:06.824353 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" event={"ID":"0f3617ef-6143-4fb4-8c84-90ce9c6be531","Type":"ContainerDied","Data":"17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"}
Mar 19 09:33:06.824440 master-0 kubenswrapper[15202]: I0319 09:33:06.824425 15202 scope.go:117] "RemoveContainer" containerID="d01c20a752c69ad4fdbf88d6635a40cc54a638ede023acdd8e476bc823088f26"
Mar 19 09:33:06.825340 master-0 kubenswrapper[15202]: I0319 09:33:06.825280 15202 scope.go:117] "RemoveContainer" containerID="17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"
Mar 19 09:33:06.825439 master-0 kubenswrapper[15202]: I0319 09:33:06.825390 15202 scope.go:117] "RemoveContainer" containerID="f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01"
Mar 19 09:33:06.826162 master-0 kubenswrapper[15202]: E0319 09:33:06.826079 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-nm9nx_openshift-machine-api(cd42096c-f18d-4bb5-8a51-8761dc1edb73)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" podUID="cd42096c-f18d-4bb5-8a51-8761dc1edb73"
Mar 19 09:33:06.991731 master-0 kubenswrapper[15202]: E0319 09:33:06.991655 15202 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3617ef_6143_4fb4_8c84_90ce9c6be531.slice/crio-conmon-17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 09:33:07.128905 master-0 kubenswrapper[15202]: I0319 09:33:07.128801 15202 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:33:07.128905 master-0 kubenswrapper[15202]: I0319 09:33:07.128905 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:33:07.833598 master-0 kubenswrapper[15202]: I0319 09:33:07.833511 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/1.log"
Mar 19 09:33:07.835651 master-0 kubenswrapper[15202]: I0319 09:33:07.835588 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/0.log"
Mar 19 09:33:07.835811 master-0 kubenswrapper[15202]: I0319 09:33:07.835660 15202 generic.go:334] "Generic (PLEG): container finished" podID="e3376275-294d-446d-9b4c-930df60dba01" containerID="995dee542cc051ca1c47aee3b5093e7beb1a53442ceb14fd42c8ed8a57e129e2" exitCode=1
Mar 19 09:33:07.835811 master-0 kubenswrapper[15202]: I0319 09:33:07.835735 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerDied","Data":"995dee542cc051ca1c47aee3b5093e7beb1a53442ceb14fd42c8ed8a57e129e2"}
Mar 19 09:33:07.835811 master-0 kubenswrapper[15202]: I0319 09:33:07.835778 15202 scope.go:117] "RemoveContainer" containerID="7d09aca9fefb402af8b2ae5b0086c54b39e7c40d8e4c2624e1555fd0e0a43d99"
Mar 19 09:33:07.837206 master-0 kubenswrapper[15202]: I0319 09:33:07.837135 15202 scope.go:117] "RemoveContainer" containerID="995dee542cc051ca1c47aee3b5093e7beb1a53442ceb14fd42c8ed8a57e129e2"
Mar 19 09:33:07.838456 master-0 kubenswrapper[15202]: E0319 09:33:07.837710 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-dzfgb_openshift-cluster-storage-operator(e3376275-294d-446d-9b4c-930df60dba01)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" podUID="e3376275-294d-446d-9b4c-930df60dba01"
Mar 19 09:33:07.839133 master-0 kubenswrapper[15202]: I0319 09:33:07.839039 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" event={"ID":"0f3617ef-6143-4fb4-8c84-90ce9c6be531","Type":"ContainerStarted","Data":"2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc"}
Mar 19 09:33:07.839395 master-0 kubenswrapper[15202]: I0319 09:33:07.839336 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:33:07.841762 master-0 kubenswrapper[15202]: I0319 09:33:07.841706 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/3.log"
Mar 19 09:33:07.845923 master-0 kubenswrapper[15202]: I0319 09:33:07.845774 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:33:08.856193 master-0 kubenswrapper[15202]: I0319 09:33:08.856104 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/1.log"
Mar 19 09:33:10.881560 master-0 kubenswrapper[15202]: I0319 09:33:10.881339 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-l8kmn_d486ce23-acf7-429a-9739-4770e1a2bf78/control-plane-machine-set-operator/0.log"
Mar 19 09:33:10.881560 master-0 kubenswrapper[15202]: I0319 09:33:10.881453 15202 generic.go:334] "Generic (PLEG): container finished" podID="d486ce23-acf7-429a-9739-4770e1a2bf78" containerID="2ea52482522c190b31e0c2a767d192b95134c189dc647d21749261e6e8b31d1e" exitCode=1
Mar 19 09:33:10.882409 master-0 kubenswrapper[15202]: I0319 09:33:10.881549 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn" event={"ID":"d486ce23-acf7-429a-9739-4770e1a2bf78","Type":"ContainerDied","Data":"2ea52482522c190b31e0c2a767d192b95134c189dc647d21749261e6e8b31d1e"}
Mar 19 09:33:10.882409 master-0 kubenswrapper[15202]: I0319 09:33:10.882386 15202 scope.go:117] "RemoveContainer" containerID="2ea52482522c190b31e0c2a767d192b95134c189dc647d21749261e6e8b31d1e"
Mar 19 09:33:11.896310 master-0 kubenswrapper[15202]: I0319 09:33:11.896189 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-l8kmn_d486ce23-acf7-429a-9739-4770e1a2bf78/control-plane-machine-set-operator/0.log"
Mar 19 09:33:11.897816 master-0 kubenswrapper[15202]: I0319 09:33:11.896321 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-l8kmn" event={"ID":"d486ce23-acf7-429a-9739-4770e1a2bf78","Type":"ContainerStarted","Data":"a03d9bcb8eef829f13473d071446200197cab1949b32f33f464a2b21695e97a4"}
Mar 19 09:33:12.910546 master-0 kubenswrapper[15202]: I0319 09:33:12.910235 15202 generic.go:334] "Generic (PLEG): container finished" podID="5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5" containerID="7625e1722b2e3b80ecf85f84a7ed20af518fcbf3270f5b73f90321c127613131" exitCode=0
Mar 19 09:33:12.910546 master-0 kubenswrapper[15202]: I0319 09:33:12.910344 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" event={"ID":"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5","Type":"ContainerDied","Data":"7625e1722b2e3b80ecf85f84a7ed20af518fcbf3270f5b73f90321c127613131"}
Mar 19 09:33:12.911592 master-0 kubenswrapper[15202]: I0319 09:33:12.911541 15202 scope.go:117] "RemoveContainer" containerID="7625e1722b2e3b80ecf85f84a7ed20af518fcbf3270f5b73f90321c127613131"
Mar 19 09:33:13.922101 master-0 kubenswrapper[15202]: I0319 09:33:13.921974 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-r75tv" event={"ID":"5c2f6f98-3bbe-42cc-81c2-f498b17e4ef5","Type":"ContainerStarted","Data":"e4eaf094c1ada309b486ca0386e7284e53f774309e85d947cd8c8a29e8f4f126"}
Mar 19 09:33:16.946996 master-0 kubenswrapper[15202]: I0319 09:33:16.946906 15202 generic.go:334] "Generic (PLEG): container finished" podID="56e4b90a881a688f81bb1f315628150f" containerID="ce05bf6050be57f679d8808c21c216584cff22bbd6c73ce590810a791f17b78b" exitCode=0
Mar 19 09:33:16.946996 master-0 kubenswrapper[15202]: I0319 09:33:16.946982 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerDied","Data":"ce05bf6050be57f679d8808c21c216584cff22bbd6c73ce590810a791f17b78b"}
Mar 19 09:33:16.947982 master-0 kubenswrapper[15202]: I0319 09:33:16.947947 15202 scope.go:117] "RemoveContainer" containerID="ce05bf6050be57f679d8808c21c216584cff22bbd6c73ce590810a791f17b78b"
Mar 19 09:33:17.811758 master-0 kubenswrapper[15202]: I0319 09:33:17.811694 15202 scope.go:117] "RemoveContainer" containerID="f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01"
Mar 19 09:33:17.812043 master-0 kubenswrapper[15202]: E0319 09:33:17.811975 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-nm9nx_openshift-machine-api(cd42096c-f18d-4bb5-8a51-8761dc1edb73)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" podUID="cd42096c-f18d-4bb5-8a51-8761dc1edb73"
Mar 19 09:33:17.955504 master-0 kubenswrapper[15202]: I0319 09:33:17.955419 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 09:33:17.956317 master-0 kubenswrapper[15202]: I0319 09:33:17.956237 15202 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a" exitCode=1
Mar 19 09:33:17.956379 master-0 kubenswrapper[15202]: I0319 09:33:17.956310 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a"}
Mar 19 09:33:17.956978 master-0 kubenswrapper[15202]: I0319 09:33:17.956953 15202 scope.go:117] "RemoveContainer" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a"
Mar 19 09:33:17.961514 master-0 kubenswrapper[15202]: I0319 09:33:17.961002 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"e6ecc8227a05dd132bbd77052651817fa06a0d1b4f6b5e8fb3f47ae909216004"}
Mar 19 09:33:18.357181 master-0 kubenswrapper[15202]: I0319 09:33:18.357037 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:18.357181 master-0 kubenswrapper[15202]: I0319 09:33:18.357171 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:18.973576 master-0 kubenswrapper[15202]: I0319 09:33:18.973348 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-qx75g_f93b8728-4a33-4ee4-b7c6-cff7d7995953/machine-api-operator/1.log"
Mar 19 09:33:18.974432 master-0 kubenswrapper[15202]: I0319 09:33:18.974158 15202 generic.go:334] "Generic (PLEG): container finished" podID="f93b8728-4a33-4ee4-b7c6-cff7d7995953" containerID="82c9db947e048d06ab325d28a9da9998cc2e567351adaf60dac34444baffd037" exitCode=255
Mar 19 09:33:18.974432 master-0 kubenswrapper[15202]: I0319 09:33:18.974241 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" event={"ID":"f93b8728-4a33-4ee4-b7c6-cff7d7995953","Type":"ContainerDied","Data":"82c9db947e048d06ab325d28a9da9998cc2e567351adaf60dac34444baffd037"}
Mar 19 09:33:18.975305 master-0 kubenswrapper[15202]: I0319 09:33:18.975256 15202 scope.go:117] "RemoveContainer" containerID="82c9db947e048d06ab325d28a9da9998cc2e567351adaf60dac34444baffd037"
Mar 19 09:33:18.978412 master-0 kubenswrapper[15202]: I0319 09:33:18.978367 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 09:33:18.978936 master-0 kubenswrapper[15202]: I0319 09:33:18.978822 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682"}
Mar 19 09:33:18.979798 master-0 kubenswrapper[15202]: I0319 09:33:18.979677 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:33:19.989587 master-0 kubenswrapper[15202]: I0319 09:33:19.989449 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-qx75g_f93b8728-4a33-4ee4-b7c6-cff7d7995953/machine-api-operator/1.log"
Mar 19 09:33:19.991116 master-0 kubenswrapper[15202]: I0319 09:33:19.991063 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-qx75g" event={"ID":"f93b8728-4a33-4ee4-b7c6-cff7d7995953","Type":"ContainerStarted","Data":"932f5755271fe68c4515bef8609cd9f80440bee3fe9984d85e6b86c95d6b348a"}
Mar 19 09:33:19.991116 master-0 kubenswrapper[15202]: I0319 09:33:19.991111 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:33:20.190024 master-0 kubenswrapper[15202]: E0319 09:33:20.189939 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s"
Mar 19 09:33:21.358051 master-0 kubenswrapper[15202]: I0319 09:33:21.357970 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:33:21.358657 master-0 kubenswrapper[15202]: I0319 09:33:21.358053 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:33:21.812382 master-0 kubenswrapper[15202]: I0319 09:33:21.812293 15202 scope.go:117] "RemoveContainer" containerID="995dee542cc051ca1c47aee3b5093e7beb1a53442ceb14fd42c8ed8a57e129e2"
Mar 19 09:33:22.011044 master-0 kubenswrapper[15202]: I0319 09:33:22.010989 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/1.log"
Mar 19 09:33:22.011203 master-0 kubenswrapper[15202]: I0319 09:33:22.011059 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerStarted","Data":"154df14d34d15bfd2206a11f4d3c358ac1208233fa4216fc1496a3442d25b1ec"}
Mar 19 09:33:28.813381 master-0 kubenswrapper[15202]: I0319 09:33:28.813267 15202 scope.go:117] "RemoveContainer" containerID="f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01"
Mar 19 09:33:29.068897 master-0 kubenswrapper[15202]: I0319 09:33:29.068771 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/3.log"
Mar 19 09:33:29.069491 master-0 kubenswrapper[15202]: I0319 09:33:29.069413 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerStarted","Data":"cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8"}
Mar 19 09:33:31.357369 master-0 kubenswrapper[15202]: I0319 09:33:31.357264 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:33:31.358135 master-0 kubenswrapper[15202]: I0319 09:33:31.357374 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:33:32.751119 master-0 kubenswrapper[15202]: E0319 09:33:32.750963 15202 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{network-node-identity-kqb2h.189e34324f08a536 openshift-network-node-identity 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-network-node-identity,Name:network-node-identity-kqb2h,UID:b2898746-6827-41d9-ac88-64206cb84ac9,APIVersion:v1,ResourceVersion:9549,FieldPath:spec.containers{approver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:31:33.90952991 +0000 UTC m=+411.294944726,LastTimestamp:2026-03-19 09:31:33.90952991 +0000 UTC m=+411.294944726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:33:37.191311 master-0 kubenswrapper[15202]: E0319 09:33:37.191195 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 09:33:37.784305 master-0 kubenswrapper[15202]: E0319 09:33:37.784232 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 09:33:38.162535 master-0 kubenswrapper[15202]: I0319 09:33:38.162479 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"36e96185314bc409d0b39b088b9a32f3aa0baaccb31b95b2845046e9a830dca0"}
Mar 19 09:33:39.176512 master-0 kubenswrapper[15202]: I0319 09:33:39.176345 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"8cdce7bcb052497478b5329edbdbb1d524bd6261c77b2c94364466f27f294b36"}
Mar 19 09:33:39.176512 master-0 kubenswrapper[15202]: I0319 09:33:39.176402 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"f07f3688049f5062d0017d0b312d2c3235f9f59de530a7b748120e6eac17677f"}
Mar 19 09:33:39.176512 master-0 kubenswrapper[15202]: I0319 09:33:39.176416 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"407807115c21f88bd9b92a2d5917005c34ac4093db2695dcbaae7267e4bfece0"}
Mar 19 09:33:39.176512 master-0 kubenswrapper[15202]: I0319 09:33:39.176427 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"e022a9e7e25ec3b1a09fdb81bb043c1885bf07a29bf34ecc021e2eb410045788"}
Mar 19 09:33:39.177274 master-0 kubenswrapper[15202]: I0319 09:33:39.176723 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61"
Mar 19 09:33:39.177274 master-0 kubenswrapper[15202]: I0319 09:33:39.176746 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61"
Mar 19 09:33:41.357804 master-0 kubenswrapper[15202]: I0319 09:33:41.357715 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:33:41.357804 master-0 kubenswrapper[15202]: I0319 09:33:41.357803 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:33:41.358638 master-0 kubenswrapper[15202]: I0319 09:33:41.357861 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:41.358678 master-0 kubenswrapper[15202]: I0319 09:33:41.358645 15202 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"e6ecc8227a05dd132bbd77052651817fa06a0d1b4f6b5e8fb3f47ae909216004"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 19 09:33:41.358791 master-0 kubenswrapper[15202]: I0319 09:33:41.358757 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" containerID="cri-o://e6ecc8227a05dd132bbd77052651817fa06a0d1b4f6b5e8fb3f47ae909216004" gracePeriod=30
Mar 19 09:33:42.223513 master-0 kubenswrapper[15202]: I0319 09:33:42.223389 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/cluster-policy-controller/1.log"
Mar 19 09:33:42.226447 master-0 kubenswrapper[15202]: I0319 09:33:42.226399 15202 generic.go:334] "Generic (PLEG): container finished" podID="56e4b90a881a688f81bb1f315628150f" containerID="e6ecc8227a05dd132bbd77052651817fa06a0d1b4f6b5e8fb3f47ae909216004" exitCode=255
Mar 19 09:33:42.226568 master-0 kubenswrapper[15202]: I0319 09:33:42.226456 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerDied","Data":"e6ecc8227a05dd132bbd77052651817fa06a0d1b4f6b5e8fb3f47ae909216004"}
Mar 19 09:33:42.226568 master-0 kubenswrapper[15202]: I0319 09:33:42.226508 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"56e4b90a881a688f81bb1f315628150f","Type":"ContainerStarted","Data":"ebf1733b19e744225a9e8c315e74e98e73b2be2483625475391f4ed66449d3f3"}
Mar 19 09:33:42.226568 master-0 kubenswrapper[15202]: I0319 09:33:42.226530 15202 scope.go:117] "RemoveContainer" containerID="ce05bf6050be57f679d8808c21c216584cff22bbd6c73ce590810a791f17b78b"
Mar 19 09:33:42.828187 master-0 kubenswrapper[15202]: I0319 09:33:42.828116 15202 status_manager.go:851] "Failed to get status for pod" podUID="b2898746-6827-41d9-ac88-64206cb84ac9" pod="openshift-network-node-identity/network-node-identity-kqb2h" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods network-node-identity-kqb2h)"
Mar 19 09:33:43.236015 master-0 kubenswrapper[15202]: I0319 09:33:43.235966 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/cluster-policy-controller/1.log"
Mar 19 09:33:43.839998 master-0 kubenswrapper[15202]: I0319 09:33:43.839921 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 19 09:33:43.840610 master-0 kubenswrapper[15202]: I0319 09:33:43.840047 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 19 09:33:48.356258 master-0 kubenswrapper[15202]: I0319 09:33:48.356127 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:48.357395 master-0 kubenswrapper[15202]: I0319 09:33:48.357365 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:51.357931 master-0 kubenswrapper[15202]: I0319 09:33:51.357854 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:33:51.358513 master-0 kubenswrapper[15202]: I0319 09:33:51.357951 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:33:52.312196 master-0 kubenswrapper[15202]: I0319 09:33:52.312137 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/2.log"
Mar 19 09:33:52.312895 master-0 kubenswrapper[15202]: I0319 09:33:52.312846 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/1.log"
Mar 19 09:33:52.312971 master-0 kubenswrapper[15202]: I0319 09:33:52.312922 15202 generic.go:334] "Generic (PLEG): container finished" podID="e3376275-294d-446d-9b4c-930df60dba01" containerID="154df14d34d15bfd2206a11f4d3c358ac1208233fa4216fc1496a3442d25b1ec" exitCode=1
Mar 19 09:33:52.313032 master-0 kubenswrapper[15202]: I0319 09:33:52.312986 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerDied","Data":"154df14d34d15bfd2206a11f4d3c358ac1208233fa4216fc1496a3442d25b1ec"}
Mar 19 09:33:52.313069 master-0 kubenswrapper[15202]: I0319 09:33:52.313044 15202 scope.go:117] "RemoveContainer" containerID="995dee542cc051ca1c47aee3b5093e7beb1a53442ceb14fd42c8ed8a57e129e2"
Mar 19 09:33:52.315848 master-0 kubenswrapper[15202]: I0319 09:33:52.315801 15202 scope.go:117] "RemoveContainer" containerID="154df14d34d15bfd2206a11f4d3c358ac1208233fa4216fc1496a3442d25b1ec"
Mar 19 09:33:52.316107 master-0 kubenswrapper[15202]: E0319 09:33:52.316070 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-dzfgb_openshift-cluster-storage-operator(e3376275-294d-446d-9b4c-930df60dba01)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" podUID="e3376275-294d-446d-9b4c-930df60dba01"
Mar 19 09:33:53.321354 master-0 kubenswrapper[15202]: I0319 09:33:53.321292 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/2.log"
Mar 19 09:33:53.861861 master-0 kubenswrapper[15202]: I0319 09:33:53.861796 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 19 09:33:58.368281 master-0 kubenswrapper[15202]: I0319 09:33:58.368221 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:58.373723 master-0 kubenswrapper[15202]: I0319 09:33:58.373684 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:33:58.860896 master-0 kubenswrapper[15202]: I0319 09:33:58.860838 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 19 09:34:05.812988 master-0 kubenswrapper[15202]: I0319 09:34:05.812906 15202 scope.go:117] "RemoveContainer" containerID="154df14d34d15bfd2206a11f4d3c358ac1208233fa4216fc1496a3442d25b1ec"
Mar 19 09:34:05.813748 master-0 kubenswrapper[15202]: E0319 09:34:05.813377 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-dzfgb_openshift-cluster-storage-operator(e3376275-294d-446d-9b4c-930df60dba01)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" podUID="e3376275-294d-446d-9b4c-930df60dba01"
Mar 19 09:34:11.760167 master-0 kubenswrapper[15202]: I0319 09:34:11.760085 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:34:13.179621 master-0 kubenswrapper[15202]: E0319 09:34:13.179548 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 09:34:13.495090 master-0 kubenswrapper[15202]: I0319 09:34:13.494920 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61"
Mar 19 09:34:13.495090 master-0 kubenswrapper[15202]: I0319 09:34:13.494988 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61"
Mar 19 09:34:16.813386 master-0 kubenswrapper[15202]: I0319 09:34:16.813324 15202 scope.go:117] "RemoveContainer" containerID="154df14d34d15bfd2206a11f4d3c358ac1208233fa4216fc1496a3442d25b1ec"
Mar 19 09:34:17.537806 master-0 kubenswrapper[15202]: I0319 09:34:17.537748 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/2.log"
Mar 19 09:34:17.538150 master-0 kubenswrapper[15202]: I0319 09:34:17.537838 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-dzfgb" event={"ID":"e3376275-294d-446d-9b4c-930df60dba01","Type":"ContainerStarted","Data":"a12e207512a5a0118caf10e84c0b2b097f23b9e0350b1a106d25446ba1b5142e"}
Mar 19 09:34:29.650201 master-0 kubenswrapper[15202]: I0319 09:34:29.650119 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/4.log"
Mar 19 09:34:29.652640 master-0 kubenswrapper[15202]: I0319 09:34:29.652560 15202 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/3.log" Mar 19 09:34:29.653803 master-0 kubenswrapper[15202]: I0319 09:34:29.653716 15202 generic.go:334] "Generic (PLEG): container finished" podID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" containerID="cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8" exitCode=1 Mar 19 09:34:29.653880 master-0 kubenswrapper[15202]: I0319 09:34:29.653811 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerDied","Data":"cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8"} Mar 19 09:34:29.654040 master-0 kubenswrapper[15202]: I0319 09:34:29.653996 15202 scope.go:117] "RemoveContainer" containerID="f64562d1a314fe86efb94398874161af93b4e04c9b76788819aa9c6427288e01" Mar 19 09:34:29.655274 master-0 kubenswrapper[15202]: I0319 09:34:29.655202 15202 scope.go:117] "RemoveContainer" containerID="cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8" Mar 19 09:34:29.655835 master-0 kubenswrapper[15202]: E0319 09:34:29.655779 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-nm9nx_openshift-machine-api(cd42096c-f18d-4bb5-8a51-8761dc1edb73)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" podUID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" Mar 19 09:34:30.663764 master-0 kubenswrapper[15202]: I0319 09:34:30.663698 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/4.log" Mar 19 09:34:42.830158 master-0 
kubenswrapper[15202]: I0319 09:34:42.830089 15202 status_manager.go:851] "Failed to get status for pod" podUID="094204df314fe45bd5af12ca1b4622bb" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 19 09:34:44.812284 master-0 kubenswrapper[15202]: I0319 09:34:44.812214 15202 scope.go:117] "RemoveContainer" containerID="cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8" Mar 19 09:34:44.812972 master-0 kubenswrapper[15202]: E0319 09:34:44.812528 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-nm9nx_openshift-machine-api(cd42096c-f18d-4bb5-8a51-8761dc1edb73)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" podUID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" Mar 19 09:34:47.498625 master-0 kubenswrapper[15202]: E0319 09:34:47.498514 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 09:34:59.812562 master-0 kubenswrapper[15202]: I0319 09:34:59.812495 15202 scope.go:117] "RemoveContainer" containerID="cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8" Mar 19 09:34:59.813289 master-0 kubenswrapper[15202]: E0319 09:34:59.812772 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-nm9nx_openshift-machine-api(cd42096c-f18d-4bb5-8a51-8761dc1edb73)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" 
podUID="cd42096c-f18d-4bb5-8a51-8761dc1edb73" Mar 19 09:35:10.813764 master-0 kubenswrapper[15202]: I0319 09:35:10.813676 15202 scope.go:117] "RemoveContainer" containerID="cced7eea2b6d3f1e8b74c9b95b6c84c5d6a6a67257109c1fb5b4bb829cf963c8" Mar 19 09:35:11.014529 master-0 kubenswrapper[15202]: I0319 09:35:11.014436 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/4.log" Mar 19 09:35:11.015581 master-0 kubenswrapper[15202]: I0319 09:35:11.015299 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-nm9nx" event={"ID":"cd42096c-f18d-4bb5-8a51-8761dc1edb73","Type":"ContainerStarted","Data":"97c57e3d1069f9ab33eb42f329fe1020b7f84827b15b9bdb242511fa46937541"} Mar 19 09:35:26.162094 master-0 kubenswrapper[15202]: I0319 09:35:26.161983 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 09:35:26.164233 master-0 kubenswrapper[15202]: E0319 09:35:26.162401 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9b2506-dac6-4a23-b2bf-e3ce77919857" containerName="installer" Mar 19 09:35:26.164233 master-0 kubenswrapper[15202]: I0319 09:35:26.162415 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9b2506-dac6-4a23-b2bf-e3ce77919857" containerName="installer" Mar 19 09:35:26.164233 master-0 kubenswrapper[15202]: I0319 09:35:26.162603 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9b2506-dac6-4a23-b2bf-e3ce77919857" containerName="installer" Mar 19 09:35:26.164233 master-0 kubenswrapper[15202]: I0319 09:35:26.163517 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.167158 master-0 kubenswrapper[15202]: I0319 09:35:26.167124 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 09:35:26.168123 master-0 kubenswrapper[15202]: I0319 09:35:26.168092 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-jh786" Mar 19 09:35:26.169308 master-0 kubenswrapper[15202]: I0319 09:35:26.169248 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-var-lock\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.169443 master-0 kubenswrapper[15202]: I0319 09:35:26.169346 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.169649 master-0 kubenswrapper[15202]: I0319 09:35:26.169615 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kube-api-access\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.176782 master-0 kubenswrapper[15202]: I0319 09:35:26.176726 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 09:35:26.271574 master-0 kubenswrapper[15202]: I0319 09:35:26.271429 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.271926 master-0 kubenswrapper[15202]: I0319 09:35:26.271598 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kube-api-access\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.271926 master-0 kubenswrapper[15202]: I0319 09:35:26.271716 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-var-lock\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.271926 master-0 kubenswrapper[15202]: I0319 09:35:26.271821 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-var-lock\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.272156 master-0 kubenswrapper[15202]: I0319 09:35:26.272081 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.290845 master-0 kubenswrapper[15202]: I0319 09:35:26.290776 15202 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kube-api-access\") pod \"installer-5-master-0\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:26.521237 master-0 kubenswrapper[15202]: I0319 09:35:26.521149 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:35:27.053924 master-0 kubenswrapper[15202]: W0319 09:35:27.053850 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0bbde12c_dfc2_434e_8bf4_f8cb88316a25.slice/crio-24f778520f5960effd837e92f64c900a5282374aa679c7da5a3181daceef1b4f WatchSource:0}: Error finding container 24f778520f5960effd837e92f64c900a5282374aa679c7da5a3181daceef1b4f: Status 404 returned error can't find the container with id 24f778520f5960effd837e92f64c900a5282374aa679c7da5a3181daceef1b4f Mar 19 09:35:27.054890 master-0 kubenswrapper[15202]: I0319 09:35:27.054831 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 09:35:27.165777 master-0 kubenswrapper[15202]: I0319 09:35:27.165698 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0bbde12c-dfc2-434e-8bf4-f8cb88316a25","Type":"ContainerStarted","Data":"24f778520f5960effd837e92f64c900a5282374aa679c7da5a3181daceef1b4f"} Mar 19 09:35:28.178579 master-0 kubenswrapper[15202]: I0319 09:35:28.178514 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0bbde12c-dfc2-434e-8bf4-f8cb88316a25","Type":"ContainerStarted","Data":"1b110177afb9f355c80af76f32a5de6fefcf2666818676e5a1c8fc118db1e735"} Mar 19 09:35:28.209971 master-0 kubenswrapper[15202]: I0319 09:35:28.209883 15202 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=2.209866422 podStartE2EDuration="2.209866422s" podCreationTimestamp="2026-03-19 09:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:28.206053371 +0000 UTC m=+645.591468197" watchObservedRunningTime="2026-03-19 09:35:28.209866422 +0000 UTC m=+645.595281238" Mar 19 09:35:28.828171 master-0 kubenswrapper[15202]: I0319 09:35:28.828086 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 09:35:28.829940 master-0 kubenswrapper[15202]: I0319 09:35:28.829894 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:28.832481 master-0 kubenswrapper[15202]: I0319 09:35:28.832424 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:35:28.832553 master-0 kubenswrapper[15202]: I0319 09:35:28.832465 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wv2vd" Mar 19 09:35:28.883185 master-0 kubenswrapper[15202]: I0319 09:35:28.883088 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 09:35:28.941425 master-0 kubenswrapper[15202]: I0319 09:35:28.941309 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:28.942002 master-0 kubenswrapper[15202]: I0319 09:35:28.941444 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-var-lock\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:28.942002 master-0 kubenswrapper[15202]: I0319 09:35:28.941512 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af31aaf4-3e95-4505-9f5c-af88c1097638-kube-api-access\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.044026 master-0 kubenswrapper[15202]: I0319 09:35:29.043434 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.044026 master-0 kubenswrapper[15202]: I0319 09:35:29.043635 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.044026 master-0 kubenswrapper[15202]: I0319 09:35:29.043673 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-var-lock\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.044026 
master-0 kubenswrapper[15202]: I0319 09:35:29.043808 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-var-lock\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.044026 master-0 kubenswrapper[15202]: I0319 09:35:29.043809 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af31aaf4-3e95-4505-9f5c-af88c1097638-kube-api-access\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.076565 master-0 kubenswrapper[15202]: I0319 09:35:29.076453 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af31aaf4-3e95-4505-9f5c-af88c1097638-kube-api-access\") pod \"installer-3-master-0\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.189625 master-0 kubenswrapper[15202]: I0319 09:35:29.189526 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:35:29.752392 master-0 kubenswrapper[15202]: I0319 09:35:29.752336 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 19 09:35:29.810171 master-0 kubenswrapper[15202]: I0319 09:35:29.809245 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:35:29.813870 master-0 kubenswrapper[15202]: I0319 09:35:29.813786 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.816672 master-0 kubenswrapper[15202]: I0319 09:35:29.816634 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-w5d24" Mar 19 09:35:29.818346 master-0 kubenswrapper[15202]: I0319 09:35:29.818312 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:35:29.866348 master-0 kubenswrapper[15202]: I0319 09:35:29.866250 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-var-lock\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.867214 master-0 kubenswrapper[15202]: I0319 09:35:29.867150 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kube-api-access\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.867478 master-0 kubenswrapper[15202]: I0319 09:35:29.867432 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.943655 master-0 kubenswrapper[15202]: I0319 09:35:29.943575 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:35:29.971487 master-0 kubenswrapper[15202]: I0319 09:35:29.971404 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-var-lock\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.972187 master-0 kubenswrapper[15202]: I0319 09:35:29.972161 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kube-api-access\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.974790 master-0 kubenswrapper[15202]: I0319 09:35:29.972757 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-var-lock\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.974790 master-0 kubenswrapper[15202]: I0319 09:35:29.973688 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.974992 master-0 kubenswrapper[15202]: I0319 09:35:29.974968 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:29.990939 master-0 kubenswrapper[15202]: I0319 09:35:29.990730 15202 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kube-api-access\") pod \"installer-5-master-0\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:30.166960 master-0 kubenswrapper[15202]: I0319 09:35:30.166871 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:35:30.200883 master-0 kubenswrapper[15202]: I0319 09:35:30.200816 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"af31aaf4-3e95-4505-9f5c-af88c1097638","Type":"ContainerStarted","Data":"04259f2cb7a617207ea80e9b9b1c811da9e90edf3ac12d3f19e7a53df1d01604"} Mar 19 09:35:30.631362 master-0 kubenswrapper[15202]: I0319 09:35:30.631282 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:35:31.213307 master-0 kubenswrapper[15202]: I0319 09:35:31.213218 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84e80953-2ee6-4580-b4a4-0f85fdacaf8f","Type":"ContainerStarted","Data":"3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f"} Mar 19 09:35:31.213307 master-0 kubenswrapper[15202]: I0319 09:35:31.213277 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84e80953-2ee6-4580-b4a4-0f85fdacaf8f","Type":"ContainerStarted","Data":"1b9c4bafde036e8f5fac7de0c4b531440c8cc03cbc48b0cdb3a428633297f388"} Mar 19 09:35:31.216377 master-0 kubenswrapper[15202]: I0319 09:35:31.216308 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" 
event={"ID":"af31aaf4-3e95-4505-9f5c-af88c1097638","Type":"ContainerStarted","Data":"4b2643abf6aa4d1022a5d527aa118dca32ff4506164827d1ac25495b362726cf"} Mar 19 09:35:31.327660 master-0 kubenswrapper[15202]: I0319 09:35:31.325369 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=2.325333397 podStartE2EDuration="2.325333397s" podCreationTimestamp="2026-03-19 09:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:31.321403914 +0000 UTC m=+648.706818770" watchObservedRunningTime="2026-03-19 09:35:31.325333397 +0000 UTC m=+648.710748223" Mar 19 09:35:31.603631 master-0 kubenswrapper[15202]: I0319 09:35:31.603355 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=3.603307934 podStartE2EDuration="3.603307934s" podCreationTimestamp="2026-03-19 09:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:31.600395394 +0000 UTC m=+648.985810250" watchObservedRunningTime="2026-03-19 09:35:31.603307934 +0000 UTC m=+648.988722780" Mar 19 09:35:36.350964 master-0 kubenswrapper[15202]: I0319 09:35:36.350893 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"] Mar 19 09:35:36.351707 master-0 kubenswrapper[15202]: I0319 09:35:36.351140 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" podUID="a2f7f5e9-658c-44a4-a42a-544247b24195" containerName="route-controller-manager" containerID="cri-o://724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676" gracePeriod=30 Mar 19 
09:35:36.361487 master-0 kubenswrapper[15202]: I0319 09:35:36.361380 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"]
Mar 19 09:35:36.365849 master-0 kubenswrapper[15202]: I0319 09:35:36.365752 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager" containerID="cri-o://2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc" gracePeriod=30
Mar 19 09:35:36.394180 master-0 kubenswrapper[15202]: I0319 09:35:36.394104 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5455ddcb95-p88pn"]
Mar 19 09:35:36.965987 master-0 kubenswrapper[15202]: I0319 09:35:36.965926 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"
Mar 19 09:35:37.014866 master-0 kubenswrapper[15202]: I0319 09:35:37.014658 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:35:37.026245 master-0 kubenswrapper[15202]: I0319 09:35:37.025547 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f7f5e9-658c-44a4-a42a-544247b24195-serving-cert\") pod \"a2f7f5e9-658c-44a4-a42a-544247b24195\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") "
Mar 19 09:35:37.026245 master-0 kubenswrapper[15202]: I0319 09:35:37.025749 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-client-ca\") pod \"a2f7f5e9-658c-44a4-a42a-544247b24195\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") "
Mar 19 09:35:37.026245 master-0 kubenswrapper[15202]: I0319 09:35:37.025819 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-config\") pod \"a2f7f5e9-658c-44a4-a42a-544247b24195\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") "
Mar 19 09:35:37.028236 master-0 kubenswrapper[15202]: I0319 09:35:37.026102 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdmhj\" (UniqueName: \"kubernetes.io/projected/a2f7f5e9-658c-44a4-a42a-544247b24195-kube-api-access-bdmhj\") pod \"a2f7f5e9-658c-44a4-a42a-544247b24195\" (UID: \"a2f7f5e9-658c-44a4-a42a-544247b24195\") "
Mar 19 09:35:37.028236 master-0 kubenswrapper[15202]: I0319 09:35:37.026938 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2f7f5e9-658c-44a4-a42a-544247b24195" (UID: "a2f7f5e9-658c-44a4-a42a-544247b24195"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:35:37.028236 master-0 kubenswrapper[15202]: I0319 09:35:37.027506 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-config" (OuterVolumeSpecName: "config") pod "a2f7f5e9-658c-44a4-a42a-544247b24195" (UID: "a2f7f5e9-658c-44a4-a42a-544247b24195"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:35:37.029207 master-0 kubenswrapper[15202]: I0319 09:35:37.029136 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f7f5e9-658c-44a4-a42a-544247b24195-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2f7f5e9-658c-44a4-a42a-544247b24195" (UID: "a2f7f5e9-658c-44a4-a42a-544247b24195"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:35:37.030255 master-0 kubenswrapper[15202]: I0319 09:35:37.030198 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f7f5e9-658c-44a4-a42a-544247b24195-kube-api-access-bdmhj" (OuterVolumeSpecName: "kube-api-access-bdmhj") pod "a2f7f5e9-658c-44a4-a42a-544247b24195" (UID: "a2f7f5e9-658c-44a4-a42a-544247b24195"). InnerVolumeSpecName "kube-api-access-bdmhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.130264 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-proxy-ca-bundles\") pod \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") "
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.130391 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-client-ca\") pod \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") "
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.130512 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-config\") pod \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") "
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.130567 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2x8\" (UniqueName: \"kubernetes.io/projected/0f3617ef-6143-4fb4-8c84-90ce9c6be531-kube-api-access-wr2x8\") pod \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") "
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.130653 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3617ef-6143-4fb4-8c84-90ce9c6be531-serving-cert\") pod \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\" (UID: \"0f3617ef-6143-4fb4-8c84-90ce9c6be531\") "
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.131252 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdmhj\" (UniqueName: \"kubernetes.io/projected/a2f7f5e9-658c-44a4-a42a-544247b24195-kube-api-access-bdmhj\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.131274 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f7f5e9-658c-44a4-a42a-544247b24195-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.131286 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.132020 master-0 kubenswrapper[15202]: I0319 09:35:37.131298 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f7f5e9-658c-44a4-a42a-544247b24195-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.134232 master-0 kubenswrapper[15202]: I0319 09:35:37.134047 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f3617ef-6143-4fb4-8c84-90ce9c6be531" (UID: "0f3617ef-6143-4fb4-8c84-90ce9c6be531"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:35:37.134733 master-0 kubenswrapper[15202]: I0319 09:35:37.134591 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-config" (OuterVolumeSpecName: "config") pod "0f3617ef-6143-4fb4-8c84-90ce9c6be531" (UID: "0f3617ef-6143-4fb4-8c84-90ce9c6be531"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:35:37.134733 master-0 kubenswrapper[15202]: I0319 09:35:37.134705 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0f3617ef-6143-4fb4-8c84-90ce9c6be531" (UID: "0f3617ef-6143-4fb4-8c84-90ce9c6be531"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:35:37.137315 master-0 kubenswrapper[15202]: I0319 09:35:37.137200 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3617ef-6143-4fb4-8c84-90ce9c6be531-kube-api-access-wr2x8" (OuterVolumeSpecName: "kube-api-access-wr2x8") pod "0f3617ef-6143-4fb4-8c84-90ce9c6be531" (UID: "0f3617ef-6143-4fb4-8c84-90ce9c6be531"). InnerVolumeSpecName "kube-api-access-wr2x8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:35:37.141252 master-0 kubenswrapper[15202]: I0319 09:35:37.141148 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3617ef-6143-4fb4-8c84-90ce9c6be531-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f3617ef-6143-4fb4-8c84-90ce9c6be531" (UID: "0f3617ef-6143-4fb4-8c84-90ce9c6be531"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:35:37.233702 master-0 kubenswrapper[15202]: I0319 09:35:37.233507 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3617ef-6143-4fb4-8c84-90ce9c6be531-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.233702 master-0 kubenswrapper[15202]: I0319 09:35:37.233622 15202 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.233702 master-0 kubenswrapper[15202]: I0319 09:35:37.233638 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.233702 master-0 kubenswrapper[15202]: I0319 09:35:37.233650 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3617ef-6143-4fb4-8c84-90ce9c6be531-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.233702 master-0 kubenswrapper[15202]: I0319 09:35:37.233663 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2x8\" (UniqueName: \"kubernetes.io/projected/0f3617ef-6143-4fb4-8c84-90ce9c6be531-kube-api-access-wr2x8\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:37.272858 master-0 kubenswrapper[15202]: I0319 09:35:37.272762 15202 generic.go:334] "Generic (PLEG): container finished" podID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerID="2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc" exitCode=0
Mar 19 09:35:37.273288 master-0 kubenswrapper[15202]: I0319 09:35:37.272927 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" event={"ID":"0f3617ef-6143-4fb4-8c84-90ce9c6be531","Type":"ContainerDied","Data":"2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc"}
Mar 19 09:35:37.273288 master-0 kubenswrapper[15202]: I0319 09:35:37.272972 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6" event={"ID":"0f3617ef-6143-4fb4-8c84-90ce9c6be531","Type":"ContainerDied","Data":"85160d2661d6ff156f0b320837a285934abdff11c671b179d0f756da855bc914"}
Mar 19 09:35:37.273288 master-0 kubenswrapper[15202]: I0319 09:35:37.273000 15202 scope.go:117] "RemoveContainer" containerID="2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc"
Mar 19 09:35:37.275580 master-0 kubenswrapper[15202]: I0319 09:35:37.273637 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"
Mar 19 09:35:37.278020 master-0 kubenswrapper[15202]: I0319 09:35:37.277983 15202 generic.go:334] "Generic (PLEG): container finished" podID="a2f7f5e9-658c-44a4-a42a-544247b24195" containerID="724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676" exitCode=0
Mar 19 09:35:37.278065 master-0 kubenswrapper[15202]: I0319 09:35:37.278033 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" event={"ID":"a2f7f5e9-658c-44a4-a42a-544247b24195","Type":"ContainerDied","Data":"724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676"}
Mar 19 09:35:37.278098 master-0 kubenswrapper[15202]: I0319 09:35:37.278068 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv" event={"ID":"a2f7f5e9-658c-44a4-a42a-544247b24195","Type":"ContainerDied","Data":"c8019030abd57d8f8a1e32054a156375adf80661f1587a2296060eba25b966e1"}
Mar 19 09:35:37.278098 master-0 kubenswrapper[15202]: I0319 09:35:37.278065 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"
Mar 19 09:35:37.296000 master-0 kubenswrapper[15202]: I0319 09:35:37.295959 15202 scope.go:117] "RemoveContainer" containerID="17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"
Mar 19 09:35:37.339829 master-0 kubenswrapper[15202]: I0319 09:35:37.339751 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"]
Mar 19 09:35:37.340931 master-0 kubenswrapper[15202]: I0319 09:35:37.340898 15202 scope.go:117] "RemoveContainer" containerID="2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc"
Mar 19 09:35:37.344511 master-0 kubenswrapper[15202]: E0319 09:35:37.344436 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc\": container with ID starting with 2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc not found: ID does not exist" containerID="2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc"
Mar 19 09:35:37.344686 master-0 kubenswrapper[15202]: I0319 09:35:37.344530 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc"} err="failed to get container status \"2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc\": rpc error: code = NotFound desc = could not find container \"2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc\": container with ID starting with 2935d77af42063a40576bf181b1a6813735f90ec6e86d96e1b5fc0e4396ac2dc not found: ID does not exist"
Mar 19 09:35:37.344686 master-0 kubenswrapper[15202]: I0319 09:35:37.344578 15202 scope.go:117] "RemoveContainer" containerID="17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"
Mar 19 09:35:37.346196 master-0 kubenswrapper[15202]: E0319 09:35:37.346159 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc\": container with ID starting with 17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc not found: ID does not exist" containerID="17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"
Mar 19 09:35:37.346282 master-0 kubenswrapper[15202]: I0319 09:35:37.346198 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc"} err="failed to get container status \"17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc\": rpc error: code = NotFound desc = could not find container \"17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc\": container with ID starting with 17b67d9ea7f78e93ef5e0de227f38716f30df75d35a059fa4c6abe99735d84fc not found: ID does not exist"
Mar 19 09:35:37.346282 master-0 kubenswrapper[15202]: I0319 09:35:37.346222 15202 scope.go:117] "RemoveContainer" containerID="724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676"
Mar 19 09:35:37.350841 master-0 kubenswrapper[15202]: I0319 09:35:37.350566 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67d4b5c54d-v56p6"]
Mar 19 09:35:37.356572 master-0 kubenswrapper[15202]: I0319 09:35:37.356517 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"]
Mar 19 09:35:37.364402 master-0 kubenswrapper[15202]: I0319 09:35:37.364344 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f995956-qbmvv"]
Mar 19 09:35:37.369886 master-0 kubenswrapper[15202]: I0319 09:35:37.369185 15202 scope.go:117] "RemoveContainer" containerID="724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676"
Mar 19 09:35:37.369886 master-0 kubenswrapper[15202]: E0319 09:35:37.369847 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676\": container with ID starting with 724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676 not found: ID does not exist" containerID="724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676"
Mar 19 09:35:37.370039 master-0 kubenswrapper[15202]: I0319 09:35:37.369882 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676"} err="failed to get container status \"724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676\": rpc error: code = NotFound desc = could not find container \"724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676\": container with ID starting with 724900467a71db54d5dd87bb9c0ac2c346d3dd663fdeaf16c652a0fd36be3676 not found: ID does not exist"
Mar 19 09:35:38.824195 master-0 kubenswrapper[15202]: I0319 09:35:38.824096 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" path="/var/lib/kubelet/pods/0f3617ef-6143-4fb4-8c84-90ce9c6be531/volumes"
Mar 19 09:35:38.824935 master-0 kubenswrapper[15202]: I0319 09:35:38.824916 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f7f5e9-658c-44a4-a42a-544247b24195" path="/var/lib/kubelet/pods/a2f7f5e9-658c-44a4-a42a-544247b24195/volumes"
Mar 19 09:35:42.160730 master-0 kubenswrapper[15202]: I0319 09:35:42.151406 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 19 09:35:42.160730 master-0 kubenswrapper[15202]: I0319 09:35:42.151760 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-5-master-0" podUID="84e80953-2ee6-4580-b4a4-0f85fdacaf8f" containerName="installer" containerID="cri-o://3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f" gracePeriod=30
Mar 19 09:35:43.993441 master-0 kubenswrapper[15202]: I0319 09:35:43.993356 15202 scope.go:117] "RemoveContainer" containerID="c07894aa55def3d2147701356df1f2900a277d1378259aefae49e515291dc919"
Mar 19 09:35:45.548205 master-0 kubenswrapper[15202]: I0319 09:35:45.548103 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"]
Mar 19 09:35:45.548984 master-0 kubenswrapper[15202]: E0319 09:35:45.548951 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
Mar 19 09:35:45.548984 master-0 kubenswrapper[15202]: I0319 09:35:45.548979 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
Mar 19 09:35:45.549116 master-0 kubenswrapper[15202]: E0319 09:35:45.549012 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f7f5e9-658c-44a4-a42a-544247b24195" containerName="route-controller-manager"
Mar 19 09:35:45.549116 master-0 kubenswrapper[15202]: I0319 09:35:45.549022 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f7f5e9-658c-44a4-a42a-544247b24195" containerName="route-controller-manager"
Mar 19 09:35:45.551082 master-0 kubenswrapper[15202]: I0319 09:35:45.549460 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f7f5e9-658c-44a4-a42a-544247b24195" containerName="route-controller-manager"
Mar 19 09:35:45.551184 master-0 kubenswrapper[15202]: I0319 09:35:45.551154 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
Mar 19 09:35:45.551348 master-0 kubenswrapper[15202]: I0319 09:35:45.551193 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
Mar 19 09:35:45.552465 master-0 kubenswrapper[15202]: I0319 09:35:45.552431 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.572609 master-0 kubenswrapper[15202]: I0319 09:35:45.572521 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"]
Mar 19 09:35:45.587137 master-0 kubenswrapper[15202]: I0319 09:35:45.585177 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f348ddf-ee67-4f81-9c16-b35d1a918669-kube-api-access\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.587137 master-0 kubenswrapper[15202]: I0319 09:35:45.585439 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.587137 master-0 kubenswrapper[15202]: I0319 09:35:45.585549 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-var-lock\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.686963 master-0 kubenswrapper[15202]: I0319 09:35:45.686842 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-var-lock\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.687456 master-0 kubenswrapper[15202]: I0319 09:35:45.687357 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-var-lock\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.687729 master-0 kubenswrapper[15202]: I0319 09:35:45.687665 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f348ddf-ee67-4f81-9c16-b35d1a918669-kube-api-access\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.687883 master-0 kubenswrapper[15202]: I0319 09:35:45.687863 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.688028 master-0 kubenswrapper[15202]: I0319 09:35:45.687999 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.705458 master-0 kubenswrapper[15202]: I0319 09:35:45.705375 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f348ddf-ee67-4f81-9c16-b35d1a918669-kube-api-access\") pod \"installer-6-master-0\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:45.895921 master-0 kubenswrapper[15202]: I0319 09:35:45.895516 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 09:35:46.487604 master-0 kubenswrapper[15202]: I0319 09:35:46.487510 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"]
Mar 19 09:35:46.489000 master-0 kubenswrapper[15202]: W0319 09:35:46.488863 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8f348ddf_ee67_4f81_9c16_b35d1a918669.slice/crio-b8f8a65533fbb387492d28abc45adacb1d94c03b5766c008e8757cddf005ddc7 WatchSource:0}: Error finding container b8f8a65533fbb387492d28abc45adacb1d94c03b5766c008e8757cddf005ddc7: Status 404 returned error can't find the container with id b8f8a65533fbb387492d28abc45adacb1d94c03b5766c008e8757cddf005ddc7
Mar 19 09:35:47.374745 master-0 kubenswrapper[15202]: I0319 09:35:47.374647 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"8f348ddf-ee67-4f81-9c16-b35d1a918669","Type":"ContainerStarted","Data":"e2243249baeedf2fe21ce14cc48c1196cadeee34e720ae9084b1811ac9f8a731"}
Mar 19 09:35:47.374745 master-0 kubenswrapper[15202]: I0319 09:35:47.374739 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"8f348ddf-ee67-4f81-9c16-b35d1a918669","Type":"ContainerStarted","Data":"b8f8a65533fbb387492d28abc45adacb1d94c03b5766c008e8757cddf005ddc7"}
Mar 19 09:35:47.398706 master-0 kubenswrapper[15202]: I0319 09:35:47.398595 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-0" podStartSLOduration=2.398552573 podStartE2EDuration="2.398552573s" podCreationTimestamp="2026-03-19 09:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:35:47.393535364 +0000 UTC m=+664.778950190" watchObservedRunningTime="2026-03-19 09:35:47.398552573 +0000 UTC m=+664.783967389"
Mar 19 09:35:58.733183 master-0 kubenswrapper[15202]: I0319 09:35:58.733040 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.733808 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" containerID="cri-o://43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" gracePeriod=30
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.733956 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" containerID="cri-o://514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" gracePeriod=30
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734146 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734596 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" containerID="cri-o://41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" gracePeriod=30
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: E0319 09:35:58.734757 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734785 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: E0319 09:35:58.734807 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="wait-for-host-port"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734816 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="wait-for-host-port"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: E0319 09:35:58.734832 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734840 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: E0319 09:35:58.734871 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734878 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: E0319 09:35:58.734901 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734910 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: E0319 09:35:58.734921 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
Mar 19 09:35:58.734922 master-0 kubenswrapper[15202]: I0319 09:35:58.734929 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3617ef-6143-4fb4-8c84-90ce9c6be531" containerName="controller-manager"
Mar 19 09:35:58.737699 master-0 kubenswrapper[15202]: I0319 09:35:58.735145 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller"
Mar 19 09:35:58.737699 master-0 kubenswrapper[15202]: I0319 09:35:58.735171 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer"
Mar 19 09:35:58.737699 master-0 kubenswrapper[15202]: I0319 09:35:58.735189 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler"
Mar 19 09:35:58.737699 master-0 kubenswrapper[15202]: I0319 09:35:58.735200 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler"
Mar 19 09:35:58.882806 master-0 kubenswrapper[15202]: I0319 09:35:58.882707 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.883031 master-0 kubenswrapper[15202]: I0319 09:35:58.882968 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.910044 master-0 kubenswrapper[15202]: I0319 09:35:58.909917 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log"
Mar 19 09:35:58.911122 master-0 kubenswrapper[15202]: I0319 09:35:58.911026 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 09:35:58.911710 master-0 kubenswrapper[15202]: I0319 09:35:58.911672 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.916601 master-0 kubenswrapper[15202]: I0319 09:35:58.916407 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8"
Mar 19 09:35:58.985415 master-0 kubenswrapper[15202]: I0319 09:35:58.985260 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"8413125cf444e5c95f023c5dd9c6151e\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") "
Mar 19 09:35:58.985651 master-0 kubenswrapper[15202]: I0319 09:35:58.985456 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"8413125cf444e5c95f023c5dd9c6151e\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") "
Mar 19 09:35:58.985651 master-0 kubenswrapper[15202]: I0319 09:35:58.985506 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8413125cf444e5c95f023c5dd9c6151e" (UID: "8413125cf444e5c95f023c5dd9c6151e"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:35:58.985651 master-0 kubenswrapper[15202]: I0319 09:35:58.985590 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8413125cf444e5c95f023c5dd9c6151e" (UID: "8413125cf444e5c95f023c5dd9c6151e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:35:58.986114 master-0 kubenswrapper[15202]: I0319 09:35:58.986028 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.986214 master-0 kubenswrapper[15202]: I0319 09:35:58.986177 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.986288 master-0 kubenswrapper[15202]: I0319 09:35:58.986252 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.986449 master-0 kubenswrapper[15202]: I0319 09:35:58.986394 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:35:58.986716 master-0 kubenswrapper[15202]: I0319 09:35:58.986643 15202 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:58.986716 master-0 kubenswrapper[15202]: I0319 09:35:58.986706 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:35:59.483696 master-0 kubenswrapper[15202]: I0319 09:35:59.483638 15202 generic.go:334] "Generic (PLEG): container finished" podID="0bbde12c-dfc2-434e-8bf4-f8cb88316a25" containerID="1b110177afb9f355c80af76f32a5de6fefcf2666818676e5a1c8fc118db1e735" exitCode=0
Mar 19 09:35:59.483967 master-0 kubenswrapper[15202]: I0319 09:35:59.483754 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0bbde12c-dfc2-434e-8bf4-f8cb88316a25","Type":"ContainerDied","Data":"1b110177afb9f355c80af76f32a5de6fefcf2666818676e5a1c8fc118db1e735"}
Mar 19 09:35:59.486947 master-0 kubenswrapper[15202]: I0319 09:35:59.486893 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log"
Mar 19 09:35:59.487621 master-0 kubenswrapper[15202]: I0319 09:35:59.487587 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 09:35:59.488097 master-0 kubenswrapper[15202]: I0319 09:35:59.488059 15202 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" exitCode=0
Mar 19 09:35:59.488160 master-0 kubenswrapper[15202]: I0319 09:35:59.488098 15202 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" exitCode=0
Mar 19 09:35:59.488160 master-0 kubenswrapper[15202]: I0319 09:35:59.488112 15202 
generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" exitCode=2 Mar 19 09:35:59.488235 master-0 kubenswrapper[15202]: I0319 09:35:59.488169 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:35:59.488305 master-0 kubenswrapper[15202]: I0319 09:35:59.488173 15202 scope.go:117] "RemoveContainer" containerID="41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" Mar 19 09:35:59.512350 master-0 kubenswrapper[15202]: I0319 09:35:59.512278 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 09:35:59.515803 master-0 kubenswrapper[15202]: I0319 09:35:59.515389 15202 scope.go:117] "RemoveContainer" containerID="514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" Mar 19 09:35:59.533751 master-0 kubenswrapper[15202]: I0319 09:35:59.533691 15202 scope.go:117] "RemoveContainer" containerID="43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" Mar 19 09:35:59.549869 master-0 kubenswrapper[15202]: I0319 09:35:59.548497 15202 scope.go:117] "RemoveContainer" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a" Mar 19 09:35:59.566769 master-0 kubenswrapper[15202]: I0319 09:35:59.566702 15202 scope.go:117] "RemoveContainer" containerID="43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc" Mar 19 09:35:59.584157 master-0 kubenswrapper[15202]: I0319 09:35:59.584105 15202 scope.go:117] "RemoveContainer" containerID="41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" Mar 19 09:35:59.584765 master-0 kubenswrapper[15202]: E0319 09:35:59.584704 15202 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": container with ID starting with 41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682 not found: ID does not exist" containerID="41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" Mar 19 09:35:59.584818 master-0 kubenswrapper[15202]: I0319 09:35:59.584787 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682"} err="failed to get container status \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": rpc error: code = NotFound desc = could not find container \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": container with ID starting with 41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682 not found: ID does not exist" Mar 19 09:35:59.584860 master-0 kubenswrapper[15202]: I0319 09:35:59.584830 15202 scope.go:117] "RemoveContainer" containerID="514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" Mar 19 09:35:59.585556 master-0 kubenswrapper[15202]: E0319 09:35:59.585505 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": container with ID starting with 514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104 not found: ID does not exist" containerID="514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" Mar 19 09:35:59.585622 master-0 kubenswrapper[15202]: I0319 09:35:59.585562 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104"} err="failed to get container status \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": rpc 
error: code = NotFound desc = could not find container \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": container with ID starting with 514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104 not found: ID does not exist" Mar 19 09:35:59.585622 master-0 kubenswrapper[15202]: I0319 09:35:59.585596 15202 scope.go:117] "RemoveContainer" containerID="43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" Mar 19 09:35:59.586205 master-0 kubenswrapper[15202]: E0319 09:35:59.586162 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": container with ID starting with 43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914 not found: ID does not exist" containerID="43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" Mar 19 09:35:59.586261 master-0 kubenswrapper[15202]: I0319 09:35:59.586202 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914"} err="failed to get container status \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": rpc error: code = NotFound desc = could not find container \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": container with ID starting with 43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914 not found: ID does not exist" Mar 19 09:35:59.586261 master-0 kubenswrapper[15202]: I0319 09:35:59.586222 15202 scope.go:117] "RemoveContainer" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a" Mar 19 09:35:59.586672 master-0 kubenswrapper[15202]: E0319 09:35:59.586636 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": 
container with ID starting with 4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a not found: ID does not exist" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a" Mar 19 09:35:59.586672 master-0 kubenswrapper[15202]: I0319 09:35:59.586665 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a"} err="failed to get container status \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": rpc error: code = NotFound desc = could not find container \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": container with ID starting with 4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a not found: ID does not exist" Mar 19 09:35:59.586754 master-0 kubenswrapper[15202]: I0319 09:35:59.586681 15202 scope.go:117] "RemoveContainer" containerID="43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc" Mar 19 09:35:59.587047 master-0 kubenswrapper[15202]: E0319 09:35:59.587008 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": container with ID starting with 43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc not found: ID does not exist" containerID="43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc" Mar 19 09:35:59.587129 master-0 kubenswrapper[15202]: I0319 09:35:59.587042 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc"} err="failed to get container status \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": rpc error: code = NotFound desc = could not find container \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": container with ID starting with 
43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc not found: ID does not exist" Mar 19 09:35:59.587129 master-0 kubenswrapper[15202]: I0319 09:35:59.587060 15202 scope.go:117] "RemoveContainer" containerID="41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" Mar 19 09:35:59.587616 master-0 kubenswrapper[15202]: I0319 09:35:59.587574 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682"} err="failed to get container status \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": rpc error: code = NotFound desc = could not find container \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": container with ID starting with 41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682 not found: ID does not exist" Mar 19 09:35:59.587667 master-0 kubenswrapper[15202]: I0319 09:35:59.587614 15202 scope.go:117] "RemoveContainer" containerID="514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" Mar 19 09:35:59.588153 master-0 kubenswrapper[15202]: I0319 09:35:59.588099 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104"} err="failed to get container status \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": rpc error: code = NotFound desc = could not find container \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": container with ID starting with 514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104 not found: ID does not exist" Mar 19 09:35:59.588153 master-0 kubenswrapper[15202]: I0319 09:35:59.588144 15202 scope.go:117] "RemoveContainer" containerID="43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" Mar 19 09:35:59.588439 master-0 kubenswrapper[15202]: I0319 09:35:59.588395 15202 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914"} err="failed to get container status \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": rpc error: code = NotFound desc = could not find container \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": container with ID starting with 43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914 not found: ID does not exist" Mar 19 09:35:59.588439 master-0 kubenswrapper[15202]: I0319 09:35:59.588430 15202 scope.go:117] "RemoveContainer" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a" Mar 19 09:35:59.589034 master-0 kubenswrapper[15202]: I0319 09:35:59.588970 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a"} err="failed to get container status \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": rpc error: code = NotFound desc = could not find container \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": container with ID starting with 4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a not found: ID does not exist" Mar 19 09:35:59.589034 master-0 kubenswrapper[15202]: I0319 09:35:59.589002 15202 scope.go:117] "RemoveContainer" containerID="43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc" Mar 19 09:35:59.589548 master-0 kubenswrapper[15202]: I0319 09:35:59.589512 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc"} err="failed to get container status \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": rpc error: code = NotFound desc = could not find container \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": container with ID starting 
with 43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc not found: ID does not exist" Mar 19 09:35:59.589548 master-0 kubenswrapper[15202]: I0319 09:35:59.589542 15202 scope.go:117] "RemoveContainer" containerID="41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682" Mar 19 09:35:59.589943 master-0 kubenswrapper[15202]: I0319 09:35:59.589895 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682"} err="failed to get container status \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": rpc error: code = NotFound desc = could not find container \"41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682\": container with ID starting with 41f913827a845b67bbbc38eb87006fcddfb0dbc94e5a60437d54d4e8b6b91682 not found: ID does not exist" Mar 19 09:35:59.589943 master-0 kubenswrapper[15202]: I0319 09:35:59.589933 15202 scope.go:117] "RemoveContainer" containerID="514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104" Mar 19 09:35:59.590447 master-0 kubenswrapper[15202]: I0319 09:35:59.590263 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104"} err="failed to get container status \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": rpc error: code = NotFound desc = could not find container \"514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104\": container with ID starting with 514ceb2f5bea70f1f522c4eb67017c3e366cec44bc4e4f535b5a51bd93545104 not found: ID does not exist" Mar 19 09:35:59.590447 master-0 kubenswrapper[15202]: I0319 09:35:59.590293 15202 scope.go:117] "RemoveContainer" containerID="43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914" Mar 19 09:35:59.590687 master-0 kubenswrapper[15202]: I0319 09:35:59.590645 15202 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914"} err="failed to get container status \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": rpc error: code = NotFound desc = could not find container \"43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914\": container with ID starting with 43a8d8d91f8b54905b7f2e1ba38ea85b5e616e12843e3acff892c8cbbe089914 not found: ID does not exist" Mar 19 09:35:59.590687 master-0 kubenswrapper[15202]: I0319 09:35:59.590674 15202 scope.go:117] "RemoveContainer" containerID="4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a" Mar 19 09:35:59.591109 master-0 kubenswrapper[15202]: I0319 09:35:59.591073 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a"} err="failed to get container status \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": rpc error: code = NotFound desc = could not find container \"4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a\": container with ID starting with 4da2ba5008e7bdd5ee5f42f41fddb98cc0cff67957f49927fdfeb8aec32b7b9a not found: ID does not exist" Mar 19 09:35:59.591109 master-0 kubenswrapper[15202]: I0319 09:35:59.591096 15202 scope.go:117] "RemoveContainer" containerID="43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc" Mar 19 09:35:59.591569 master-0 kubenswrapper[15202]: I0319 09:35:59.591533 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc"} err="failed to get container status \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": rpc error: code = NotFound desc = could not find container \"43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc\": 
container with ID starting with 43c00b8ca962810634e5f4c7d5386c9000bf176ba084a80f89de3350ef6f5abc not found: ID does not exist" Mar 19 09:36:00.829930 master-0 kubenswrapper[15202]: I0319 09:36:00.829732 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8413125cf444e5c95f023c5dd9c6151e" path="/var/lib/kubelet/pods/8413125cf444e5c95f023c5dd9c6151e/volumes" Mar 19 09:36:00.901880 master-0 kubenswrapper[15202]: I0319 09:36:00.901793 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:36:01.027257 master-0 kubenswrapper[15202]: I0319 09:36:01.027158 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-var-lock\") pod \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " Mar 19 09:36:01.027589 master-0 kubenswrapper[15202]: I0319 09:36:01.027295 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kube-api-access\") pod \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " Mar 19 09:36:01.027589 master-0 kubenswrapper[15202]: I0319 09:36:01.027356 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-var-lock" (OuterVolumeSpecName: "var-lock") pod "0bbde12c-dfc2-434e-8bf4-f8cb88316a25" (UID: "0bbde12c-dfc2-434e-8bf4-f8cb88316a25"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:01.027589 master-0 kubenswrapper[15202]: I0319 09:36:01.027369 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kubelet-dir\") pod \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\" (UID: \"0bbde12c-dfc2-434e-8bf4-f8cb88316a25\") " Mar 19 09:36:01.027589 master-0 kubenswrapper[15202]: I0319 09:36:01.027395 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0bbde12c-dfc2-434e-8bf4-f8cb88316a25" (UID: "0bbde12c-dfc2-434e-8bf4-f8cb88316a25"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:01.027937 master-0 kubenswrapper[15202]: I0319 09:36:01.027896 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:01.027937 master-0 kubenswrapper[15202]: I0319 09:36:01.027928 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:01.030801 master-0 kubenswrapper[15202]: I0319 09:36:01.030713 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0bbde12c-dfc2-434e-8bf4-f8cb88316a25" (UID: "0bbde12c-dfc2-434e-8bf4-f8cb88316a25"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:01.129752 master-0 kubenswrapper[15202]: I0319 09:36:01.129538 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bbde12c-dfc2-434e-8bf4-f8cb88316a25-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:01.445812 master-0 kubenswrapper[15202]: I0319 09:36:01.445719 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" containerID="cri-o://bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd" gracePeriod=15 Mar 19 09:36:01.509120 master-0 kubenswrapper[15202]: I0319 09:36:01.509051 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 09:36:01.509331 master-0 kubenswrapper[15202]: I0319 09:36:01.509129 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"0bbde12c-dfc2-434e-8bf4-f8cb88316a25","Type":"ContainerDied","Data":"24f778520f5960effd837e92f64c900a5282374aa679c7da5a3181daceef1b4f"} Mar 19 09:36:01.509331 master-0 kubenswrapper[15202]: I0319 09:36:01.509226 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24f778520f5960effd837e92f64c900a5282374aa679c7da5a3181daceef1b4f" Mar 19 09:36:01.889599 master-0 kubenswrapper[15202]: I0319 09:36:01.889519 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:36:02.047675 master-0 kubenswrapper[15202]: I0319 09:36:02.047609 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-session\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.047822 master-0 kubenswrapper[15202]: I0319 09:36:02.047693 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-serving-cert\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.047822 master-0 kubenswrapper[15202]: I0319 09:36:02.047746 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-error\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.047920 master-0 kubenswrapper[15202]: I0319 09:36:02.047820 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-provider-selection\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.047920 master-0 kubenswrapper[15202]: I0319 09:36:02.047894 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-login\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.048012 master-0 kubenswrapper[15202]: I0319 09:36:02.047965 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-dir\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.048012 master-0 kubenswrapper[15202]: I0319 09:36:02.048000 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-policies\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.048096 master-0 kubenswrapper[15202]: I0319 09:36:02.048042 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-router-certs\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.048096 master-0 kubenswrapper[15202]: I0319 09:36:02.048072 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-cliconfig\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.048719 master-0 kubenswrapper[15202]: I0319 09:36:02.048642 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-dir" (OuterVolumeSpecName: "audit-dir") pod 
"5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:02.048792 master-0 kubenswrapper[15202]: I0319 09:36:02.048771 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:02.048841 master-0 kubenswrapper[15202]: I0319 09:36:02.048100 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-ocp-branding-template\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.049098 master-0 kubenswrapper[15202]: I0319 09:36:02.049059 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt4h9\" (UniqueName: \"kubernetes.io/projected/5c2d7253-f08b-4aa3-b728-6012da77f513-kube-api-access-xt4h9\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.049245 master-0 kubenswrapper[15202]: I0319 09:36:02.049171 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-service-ca\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.049843 master-0 kubenswrapper[15202]: I0319 09:36:02.049151 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:02.049843 master-0 kubenswrapper[15202]: I0319 09:36:02.049701 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:02.049945 master-0 kubenswrapper[15202]: I0319 09:36:02.049793 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-trusted-ca-bundle\") pod \"5c2d7253-f08b-4aa3-b728-6012da77f513\" (UID: \"5c2d7253-f08b-4aa3-b728-6012da77f513\") " Mar 19 09:36:02.050654 master-0 kubenswrapper[15202]: I0319 09:36:02.050585 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:02.050654 master-0 kubenswrapper[15202]: I0319 09:36:02.050651 15202 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.050790 master-0 kubenswrapper[15202]: I0319 09:36:02.050669 15202 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.050790 master-0 kubenswrapper[15202]: I0319 09:36:02.050680 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.050790 master-0 kubenswrapper[15202]: I0319 09:36:02.050740 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.052689 master-0 kubenswrapper[15202]: I0319 09:36:02.052653 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.052782 master-0 kubenswrapper[15202]: I0319 09:36:02.052714 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.053334 master-0 kubenswrapper[15202]: I0319 09:36:02.053236 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.053492 master-0 kubenswrapper[15202]: I0319 09:36:02.053413 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.053564 master-0 kubenswrapper[15202]: I0319 09:36:02.053457 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.053611 master-0 kubenswrapper[15202]: I0319 09:36:02.053590 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.054027 master-0 kubenswrapper[15202]: I0319 09:36:02.053978 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2d7253-f08b-4aa3-b728-6012da77f513-kube-api-access-xt4h9" (OuterVolumeSpecName: "kube-api-access-xt4h9") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "kube-api-access-xt4h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:02.054098 master-0 kubenswrapper[15202]: I0319 09:36:02.054079 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5c2d7253-f08b-4aa3-b728-6012da77f513" (UID: "5c2d7253-f08b-4aa3-b728-6012da77f513"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:02.152618 master-0 kubenswrapper[15202]: I0319 09:36:02.152502 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.152618 master-0 kubenswrapper[15202]: I0319 09:36:02.152639 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 kubenswrapper[15202]: I0319 09:36:02.152656 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt4h9\" (UniqueName: \"kubernetes.io/projected/5c2d7253-f08b-4aa3-b728-6012da77f513-kube-api-access-xt4h9\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 kubenswrapper[15202]: I0319 09:36:02.152667 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 kubenswrapper[15202]: I0319 09:36:02.152679 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 kubenswrapper[15202]: I0319 09:36:02.152700 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 
kubenswrapper[15202]: I0319 09:36:02.152711 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 kubenswrapper[15202]: I0319 09:36:02.152721 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.153037 master-0 kubenswrapper[15202]: I0319 09:36:02.152731 15202 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c2d7253-f08b-4aa3-b728-6012da77f513-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.371114 master-0 kubenswrapper[15202]: I0319 09:36:02.371040 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_84e80953-2ee6-4580-b4a4-0f85fdacaf8f/installer/0.log" Mar 19 09:36:02.371333 master-0 kubenswrapper[15202]: I0319 09:36:02.371140 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:36:02.456454 master-0 kubenswrapper[15202]: I0319 09:36:02.456371 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kubelet-dir\") pod \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " Mar 19 09:36:02.456454 master-0 kubenswrapper[15202]: I0319 09:36:02.456458 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-var-lock\") pod \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " Mar 19 09:36:02.456813 master-0 kubenswrapper[15202]: I0319 09:36:02.456588 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "84e80953-2ee6-4580-b4a4-0f85fdacaf8f" (UID: "84e80953-2ee6-4580-b4a4-0f85fdacaf8f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:02.456813 master-0 kubenswrapper[15202]: I0319 09:36:02.456740 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kube-api-access\") pod \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\" (UID: \"84e80953-2ee6-4580-b4a4-0f85fdacaf8f\") " Mar 19 09:36:02.456896 master-0 kubenswrapper[15202]: I0319 09:36:02.456768 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-var-lock" (OuterVolumeSpecName: "var-lock") pod "84e80953-2ee6-4580-b4a4-0f85fdacaf8f" (UID: "84e80953-2ee6-4580-b4a4-0f85fdacaf8f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:02.457222 master-0 kubenswrapper[15202]: I0319 09:36:02.457195 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.457222 master-0 kubenswrapper[15202]: I0319 09:36:02.457215 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.460618 master-0 kubenswrapper[15202]: I0319 09:36:02.460577 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "84e80953-2ee6-4580-b4a4-0f85fdacaf8f" (UID: "84e80953-2ee6-4580-b4a4-0f85fdacaf8f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:02.522389 master-0 kubenswrapper[15202]: I0319 09:36:02.522315 15202 generic.go:334] "Generic (PLEG): container finished" podID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerID="bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd" exitCode=0 Mar 19 09:36:02.522671 master-0 kubenswrapper[15202]: I0319 09:36:02.522407 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" event={"ID":"5c2d7253-f08b-4aa3-b728-6012da77f513","Type":"ContainerDied","Data":"bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd"} Mar 19 09:36:02.522671 master-0 kubenswrapper[15202]: I0319 09:36:02.522448 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" event={"ID":"5c2d7253-f08b-4aa3-b728-6012da77f513","Type":"ContainerDied","Data":"8a3d4063a2259d6d3f04875dc5120e12eb59eab6fac456381c57fc80ddc8110a"} Mar 19 09:36:02.522671 master-0 kubenswrapper[15202]: I0319 09:36:02.522490 15202 scope.go:117] "RemoveContainer" containerID="bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd" Mar 19 09:36:02.523605 master-0 kubenswrapper[15202]: I0319 09:36:02.523366 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5455ddcb95-p88pn" Mar 19 09:36:02.535841 master-0 kubenswrapper[15202]: I0319 09:36:02.535768 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_84e80953-2ee6-4580-b4a4-0f85fdacaf8f/installer/0.log" Mar 19 09:36:02.536164 master-0 kubenswrapper[15202]: I0319 09:36:02.535854 15202 generic.go:334] "Generic (PLEG): container finished" podID="84e80953-2ee6-4580-b4a4-0f85fdacaf8f" containerID="3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f" exitCode=1 Mar 19 09:36:02.536164 master-0 kubenswrapper[15202]: I0319 09:36:02.535904 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84e80953-2ee6-4580-b4a4-0f85fdacaf8f","Type":"ContainerDied","Data":"3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f"} Mar 19 09:36:02.536164 master-0 kubenswrapper[15202]: I0319 09:36:02.535943 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84e80953-2ee6-4580-b4a4-0f85fdacaf8f","Type":"ContainerDied","Data":"1b9c4bafde036e8f5fac7de0c4b531440c8cc03cbc48b0cdb3a428633297f388"} Mar 19 09:36:02.536164 master-0 kubenswrapper[15202]: I0319 09:36:02.535938 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 09:36:02.561961 master-0 kubenswrapper[15202]: I0319 09:36:02.561872 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e80953-2ee6-4580-b4a4-0f85fdacaf8f-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:02.568542 master-0 kubenswrapper[15202]: I0319 09:36:02.568484 15202 scope.go:117] "RemoveContainer" containerID="bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd" Mar 19 09:36:02.569061 master-0 kubenswrapper[15202]: E0319 09:36:02.569026 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd\": container with ID starting with bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd not found: ID does not exist" containerID="bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd" Mar 19 09:36:02.569131 master-0 kubenswrapper[15202]: I0319 09:36:02.569068 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd"} err="failed to get container status \"bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd\": rpc error: code = NotFound desc = could not find container \"bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd\": container with ID starting with bda34250610e2cc31e2054b3cf632c94b9f9b40d5765ef4f67e4015c425acbbd not found: ID does not exist" Mar 19 09:36:02.569131 master-0 kubenswrapper[15202]: I0319 09:36:02.569096 15202 scope.go:117] "RemoveContainer" containerID="3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f" Mar 19 09:36:02.581834 master-0 kubenswrapper[15202]: I0319 09:36:02.581744 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-5455ddcb95-p88pn"] Mar 19 09:36:02.587507 master-0 kubenswrapper[15202]: I0319 09:36:02.587419 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5455ddcb95-p88pn"] Mar 19 09:36:02.608195 master-0 kubenswrapper[15202]: I0319 09:36:02.603352 15202 scope.go:117] "RemoveContainer" containerID="3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f" Mar 19 09:36:02.608195 master-0 kubenswrapper[15202]: E0319 09:36:02.604012 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f\": container with ID starting with 3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f not found: ID does not exist" containerID="3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f" Mar 19 09:36:02.608195 master-0 kubenswrapper[15202]: I0319 09:36:02.604077 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f"} err="failed to get container status \"3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f\": rpc error: code = NotFound desc = could not find container \"3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f\": container with ID starting with 3fb5694c53eabcbb70127e2babb9cd92e656f2ad44bf37a0ab172e1ca4ac3c4f not found: ID does not exist" Mar 19 09:36:02.617949 master-0 kubenswrapper[15202]: I0319 09:36:02.617838 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:36:02.626627 master-0 kubenswrapper[15202]: I0319 09:36:02.626543 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 09:36:02.825845 master-0 kubenswrapper[15202]: I0319 09:36:02.825619 15202 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" path="/var/lib/kubelet/pods/5c2d7253-f08b-4aa3-b728-6012da77f513/volumes" Mar 19 09:36:02.826906 master-0 kubenswrapper[15202]: I0319 09:36:02.826847 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e80953-2ee6-4580-b4a4-0f85fdacaf8f" path="/var/lib/kubelet/pods/84e80953-2ee6-4580-b4a4-0f85fdacaf8f/volumes" Mar 19 09:36:02.906493 master-0 kubenswrapper[15202]: I0319 09:36:02.906402 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:36:02.907124 master-0 kubenswrapper[15202]: I0319 09:36:02.906879 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://104d66d823f67f2d0db81952b3e75346a0594dd7f2e33f5fb4f808501d9d251d" gracePeriod=30 Mar 19 09:36:02.907124 master-0 kubenswrapper[15202]: I0319 09:36:02.906879 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" containerID="cri-o://ebf1733b19e744225a9e8c315e74e98e73b2be2483625475391f4ed66449d3f3" gracePeriod=30 Mar 19 09:36:02.907256 master-0 kubenswrapper[15202]: I0319 09:36:02.906863 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://3c9753a11d434b49ac1c7706c0cfb9ad45a06cf5f0dedce5c7137c69786d006a" gracePeriod=30 Mar 19 09:36:02.907388 master-0 kubenswrapper[15202]: I0319 09:36:02.906804 15202 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager" containerID="cri-o://aa9ca5f81be4d21b4000c8f0fdd07fdc216eb78f16c7cfa49b4fbe85f9057a8c" gracePeriod=30 Mar 19 09:36:02.908248 master-0 kubenswrapper[15202]: I0319 09:36:02.908190 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:36:02.908681 master-0 kubenswrapper[15202]: E0319 09:36:02.908641 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e80953-2ee6-4580-b4a4-0f85fdacaf8f" containerName="installer" Mar 19 09:36:02.908681 master-0 kubenswrapper[15202]: I0319 09:36:02.908673 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e80953-2ee6-4580-b4a4-0f85fdacaf8f" containerName="installer" Mar 19 09:36:02.908779 master-0 kubenswrapper[15202]: E0319 09:36:02.908701 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.908779 master-0 kubenswrapper[15202]: I0319 09:36:02.908711 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.908779 master-0 kubenswrapper[15202]: E0319 09:36:02.908742 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-recovery-controller" Mar 19 09:36:02.908779 master-0 kubenswrapper[15202]: I0319 09:36:02.908751 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-recovery-controller" Mar 19 09:36:02.908779 master-0 kubenswrapper[15202]: E0319 09:36:02.908761 15202 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0bbde12c-dfc2-434e-8bf4-f8cb88316a25" containerName="installer" Mar 19 09:36:02.908779 master-0 kubenswrapper[15202]: I0319 09:36:02.908769 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbde12c-dfc2-434e-8bf4-f8cb88316a25" containerName="installer" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: E0319 09:36:02.908802 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: I0319 09:36:02.908813 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: E0319 09:36:02.908823 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: I0319 09:36:02.908831 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: E0319 09:36:02.908840 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: I0319 09:36:02.908849 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: E0319 09:36:02.908864 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-cert-syncer" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: I0319 09:36:02.908872 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e4b90a881a688f81bb1f315628150f" 
containerName="kube-controller-manager-cert-syncer" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: E0319 09:36:02.908888 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" Mar 19 09:36:02.909026 master-0 kubenswrapper[15202]: I0319 09:36:02.908897 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909088 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbde12c-dfc2-434e-8bf4-f8cb88316a25" containerName="installer" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909129 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-cert-syncer" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909141 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909157 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager-recovery-controller" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909173 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909184 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e4b90a881a688f81bb1f315628150f" containerName="kube-controller-manager" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909196 15202 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="56e4b90a881a688f81bb1f315628150f" containerName="cluster-policy-controller" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909205 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2d7253-f08b-4aa3-b728-6012da77f513" containerName="oauth-openshift" Mar 19 09:36:02.909388 master-0 kubenswrapper[15202]: I0319 09:36:02.909219 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e80953-2ee6-4580-b4a4-0f85fdacaf8f" containerName="installer" Mar 19 09:36:03.070563 master-0 kubenswrapper[15202]: I0319 09:36:03.070505 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/beb38ec27e482ba63d3c0762a843676a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"beb38ec27e482ba63d3c0762a843676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.070792 master-0 kubenswrapper[15202]: I0319 09:36:03.070568 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/beb38ec27e482ba63d3c0762a843676a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"beb38ec27e482ba63d3c0762a843676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.153459 master-0 kubenswrapper[15202]: I0319 09:36:03.153302 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/cluster-policy-controller/1.log" Mar 19 09:36:03.154789 master-0 kubenswrapper[15202]: I0319 09:36:03.154752 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/kube-controller-manager-cert-syncer/0.log" Mar 19 09:36:03.155550 master-0 kubenswrapper[15202]: 
I0319 09:36:03.155516 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.160734 master-0 kubenswrapper[15202]: I0319 09:36:03.160698 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="56e4b90a881a688f81bb1f315628150f" podUID="beb38ec27e482ba63d3c0762a843676a" Mar 19 09:36:03.172045 master-0 kubenswrapper[15202]: I0319 09:36:03.171969 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/beb38ec27e482ba63d3c0762a843676a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"beb38ec27e482ba63d3c0762a843676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.172249 master-0 kubenswrapper[15202]: I0319 09:36:03.172185 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/beb38ec27e482ba63d3c0762a843676a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"beb38ec27e482ba63d3c0762a843676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.172302 master-0 kubenswrapper[15202]: I0319 09:36:03.172250 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/beb38ec27e482ba63d3c0762a843676a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"beb38ec27e482ba63d3c0762a843676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.172699 master-0 kubenswrapper[15202]: I0319 09:36:03.172432 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/beb38ec27e482ba63d3c0762a843676a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"beb38ec27e482ba63d3c0762a843676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.273592 master-0 kubenswrapper[15202]: I0319 09:36:03.273532 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-cert-dir\") pod \"56e4b90a881a688f81bb1f315628150f\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " Mar 19 09:36:03.273904 master-0 kubenswrapper[15202]: I0319 09:36:03.273638 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-resource-dir\") pod \"56e4b90a881a688f81bb1f315628150f\" (UID: \"56e4b90a881a688f81bb1f315628150f\") " Mar 19 09:36:03.273904 master-0 kubenswrapper[15202]: I0319 09:36:03.273675 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "56e4b90a881a688f81bb1f315628150f" (UID: "56e4b90a881a688f81bb1f315628150f"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:03.273904 master-0 kubenswrapper[15202]: I0319 09:36:03.273739 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "56e4b90a881a688f81bb1f315628150f" (UID: "56e4b90a881a688f81bb1f315628150f"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:03.274110 master-0 kubenswrapper[15202]: I0319 09:36:03.274075 15202 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:03.274110 master-0 kubenswrapper[15202]: I0319 09:36:03.274099 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56e4b90a881a688f81bb1f315628150f-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:03.548357 master-0 kubenswrapper[15202]: I0319 09:36:03.548321 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/cluster-policy-controller/1.log" Mar 19 09:36:03.549399 master-0 kubenswrapper[15202]: I0319 09:36:03.549384 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/kube-controller-manager-cert-syncer/0.log" Mar 19 09:36:03.550018 master-0 kubenswrapper[15202]: I0319 09:36:03.549997 15202 generic.go:334] "Generic (PLEG): container finished" podID="56e4b90a881a688f81bb1f315628150f" containerID="ebf1733b19e744225a9e8c315e74e98e73b2be2483625475391f4ed66449d3f3" exitCode=0 Mar 19 09:36:03.550109 master-0 kubenswrapper[15202]: I0319 09:36:03.550096 15202 generic.go:334] "Generic (PLEG): container finished" podID="56e4b90a881a688f81bb1f315628150f" containerID="104d66d823f67f2d0db81952b3e75346a0594dd7f2e33f5fb4f808501d9d251d" exitCode=0 Mar 19 09:36:03.550172 master-0 kubenswrapper[15202]: I0319 09:36:03.550161 15202 generic.go:334] "Generic (PLEG): container finished" podID="56e4b90a881a688f81bb1f315628150f" containerID="3c9753a11d434b49ac1c7706c0cfb9ad45a06cf5f0dedce5c7137c69786d006a" exitCode=2 Mar 19 09:36:03.550229 master-0 
kubenswrapper[15202]: I0319 09:36:03.550219 15202 generic.go:334] "Generic (PLEG): container finished" podID="56e4b90a881a688f81bb1f315628150f" containerID="aa9ca5f81be4d21b4000c8f0fdd07fdc216eb78f16c7cfa49b4fbe85f9057a8c" exitCode=0 Mar 19 09:36:03.550340 master-0 kubenswrapper[15202]: I0319 09:36:03.550326 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78f1c2e280836c1221080a867e5d75e5d53fea7242964b76feeee5cd30e104dd" Mar 19 09:36:03.550406 master-0 kubenswrapper[15202]: I0319 09:36:03.550396 15202 scope.go:117] "RemoveContainer" containerID="e6ecc8227a05dd132bbd77052651817fa06a0d1b4f6b5e8fb3f47ae909216004" Mar 19 09:36:03.550591 master-0 kubenswrapper[15202]: I0319 09:36:03.550574 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:03.554844 master-0 kubenswrapper[15202]: I0319 09:36:03.554763 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="56e4b90a881a688f81bb1f315628150f" podUID="beb38ec27e482ba63d3c0762a843676a" Mar 19 09:36:03.557228 master-0 kubenswrapper[15202]: I0319 09:36:03.557206 15202 generic.go:334] "Generic (PLEG): container finished" podID="af31aaf4-3e95-4505-9f5c-af88c1097638" containerID="4b2643abf6aa4d1022a5d527aa118dca32ff4506164827d1ac25495b362726cf" exitCode=0 Mar 19 09:36:03.557284 master-0 kubenswrapper[15202]: I0319 09:36:03.557238 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"af31aaf4-3e95-4505-9f5c-af88c1097638","Type":"ContainerDied","Data":"4b2643abf6aa4d1022a5d527aa118dca32ff4506164827d1ac25495b362726cf"} Mar 19 09:36:03.592254 master-0 kubenswrapper[15202]: I0319 09:36:03.592187 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="56e4b90a881a688f81bb1f315628150f" podUID="beb38ec27e482ba63d3c0762a843676a" Mar 19 09:36:04.572434 master-0 kubenswrapper[15202]: I0319 09:36:04.572343 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_56e4b90a881a688f81bb1f315628150f/kube-controller-manager-cert-syncer/0.log" Mar 19 09:36:04.821257 master-0 kubenswrapper[15202]: I0319 09:36:04.821139 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e4b90a881a688f81bb1f315628150f" path="/var/lib/kubelet/pods/56e4b90a881a688f81bb1f315628150f/volumes" Mar 19 09:36:04.940082 master-0 kubenswrapper[15202]: I0319 09:36:04.939995 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:36:05.057586 master-0 kubenswrapper[15202]: I0319 09:36:05.015103 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-var-lock\") pod \"af31aaf4-3e95-4505-9f5c-af88c1097638\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " Mar 19 09:36:05.057586 master-0 kubenswrapper[15202]: I0319 09:36:05.015246 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-kubelet-dir\") pod \"af31aaf4-3e95-4505-9f5c-af88c1097638\" (UID: \"af31aaf4-3e95-4505-9f5c-af88c1097638\") " Mar 19 09:36:05.057586 master-0 kubenswrapper[15202]: I0319 09:36:05.015371 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af31aaf4-3e95-4505-9f5c-af88c1097638-kube-api-access\") pod \"af31aaf4-3e95-4505-9f5c-af88c1097638\" (UID: 
\"af31aaf4-3e95-4505-9f5c-af88c1097638\") " Mar 19 09:36:05.057586 master-0 kubenswrapper[15202]: I0319 09:36:05.017331 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-var-lock" (OuterVolumeSpecName: "var-lock") pod "af31aaf4-3e95-4505-9f5c-af88c1097638" (UID: "af31aaf4-3e95-4505-9f5c-af88c1097638"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:05.057586 master-0 kubenswrapper[15202]: I0319 09:36:05.017366 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af31aaf4-3e95-4505-9f5c-af88c1097638" (UID: "af31aaf4-3e95-4505-9f5c-af88c1097638"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:05.060874 master-0 kubenswrapper[15202]: I0319 09:36:05.060797 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af31aaf4-3e95-4505-9f5c-af88c1097638-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af31aaf4-3e95-4505-9f5c-af88c1097638" (UID: "af31aaf4-3e95-4505-9f5c-af88c1097638"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:05.117612 master-0 kubenswrapper[15202]: I0319 09:36:05.117504 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:05.117612 master-0 kubenswrapper[15202]: I0319 09:36:05.117573 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af31aaf4-3e95-4505-9f5c-af88c1097638-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:05.117612 master-0 kubenswrapper[15202]: I0319 09:36:05.117589 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af31aaf4-3e95-4505-9f5c-af88c1097638-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:05.585907 master-0 kubenswrapper[15202]: I0319 09:36:05.585826 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"af31aaf4-3e95-4505-9f5c-af88c1097638","Type":"ContainerDied","Data":"04259f2cb7a617207ea80e9b9b1c811da9e90edf3ac12d3f19e7a53df1d01604"} Mar 19 09:36:05.585907 master-0 kubenswrapper[15202]: I0319 09:36:05.585886 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04259f2cb7a617207ea80e9b9b1c811da9e90edf3ac12d3f19e7a53df1d01604" Mar 19 09:36:05.586831 master-0 kubenswrapper[15202]: I0319 09:36:05.586046 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 19 09:36:12.812132 master-0 kubenswrapper[15202]: I0319 09:36:12.812037 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:36:12.821072 master-0 kubenswrapper[15202]: I0319 09:36:12.821015 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:36:12.821337 master-0 kubenswrapper[15202]: I0319 09:36:12.821309 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:36:12.851606 master-0 kubenswrapper[15202]: I0319 09:36:12.848549 15202 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 19 09:36:12.854228 master-0 kubenswrapper[15202]: I0319 09:36:12.854170 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:36:12.861890 master-0 kubenswrapper[15202]: I0319 09:36:12.861828 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:36:12.865939 master-0 kubenswrapper[15202]: I0319 09:36:12.865861 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="cf111880-8321-4ccd-9173-30e27ab4fb9a" Mar 19 09:36:12.865939 master-0 kubenswrapper[15202]: I0319 09:36:12.865931 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="cf111880-8321-4ccd-9173-30e27ab4fb9a" Mar 19 09:36:12.878955 master-0 kubenswrapper[15202]: I0319 09:36:12.878896 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 09:36:12.891972 master-0 kubenswrapper[15202]: I0319 09:36:12.889361 15202 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:36:12.896688 master-0 kubenswrapper[15202]: I0319 09:36:12.896616 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:36:12.904437 master-0 kubenswrapper[15202]: I0319 09:36:12.904353 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:36:12.970107 master-0 kubenswrapper[15202]: I0319 09:36:12.970028 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:36:12.978545 master-0 kubenswrapper[15202]: I0319 09:36:12.978441 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 09:36:12.998276 master-0 kubenswrapper[15202]: W0319 09:36:12.998233 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e27b7d086edf5d2cf47b703574641d8.slice/crio-ca349de8af11495389a799ea17caf7b386b96b72a11bc773732f1758e885e157 WatchSource:0}: Error finding container ca349de8af11495389a799ea17caf7b386b96b72a11bc773732f1758e885e157: Status 404 returned error can't find the container with id ca349de8af11495389a799ea17caf7b386b96b72a11bc773732f1758e885e157 Mar 19 09:36:13.666693 master-0 kubenswrapper[15202]: I0319 09:36:13.666618 15202 generic.go:334] "Generic (PLEG): container finished" podID="8e27b7d086edf5d2cf47b703574641d8" containerID="f22abb93d67b5017ffc1e6fd24a3765822df7c61dd99a6d6c00a8da701cd689f" exitCode=0 Mar 19 09:36:13.666979 master-0 kubenswrapper[15202]: I0319 09:36:13.666754 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerDied","Data":"f22abb93d67b5017ffc1e6fd24a3765822df7c61dd99a6d6c00a8da701cd689f"} Mar 19 09:36:13.666979 master-0 kubenswrapper[15202]: I0319 09:36:13.666811 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"ca349de8af11495389a799ea17caf7b386b96b72a11bc773732f1758e885e157"} Mar 19 09:36:13.667318 master-0 kubenswrapper[15202]: I0319 09:36:13.667296 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:36:13.667318 master-0 kubenswrapper[15202]: I0319 09:36:13.667318 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="9d1b709b-b8b2-4920-af59-4ae781363b61" Mar 19 09:36:13.728547 master-0 kubenswrapper[15202]: I0319 09:36:13.728109 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.728078774 podStartE2EDuration="1.728078774s" podCreationTimestamp="2026-03-19 09:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:36:13.72407943 +0000 UTC m=+691.109494256" watchObservedRunningTime="2026-03-19 09:36:13.728078774 +0000 UTC m=+691.113493590" Mar 19 09:36:14.684528 master-0 kubenswrapper[15202]: I0319 09:36:14.684438 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"2a33f70166ba53215a1b577ba6482b5aad498b21501cbaca8a58622a5e51efc3"} Mar 19 09:36:14.684528 master-0 kubenswrapper[15202]: I0319 09:36:14.684528 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"16f265af329bf116e3236f9d53a7ea4085e292017ee2d8d004e3a9ba4c771ff5"} Mar 19 09:36:14.684528 master-0 kubenswrapper[15202]: I0319 09:36:14.684540 15202 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"7600a28c1602bd08d2cdebb8e25275e0e388e64b157e417233f8719dcbdd4583"} Mar 19 09:36:14.685172 master-0 kubenswrapper[15202]: I0319 09:36:14.685038 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 09:36:14.718876 master-0 kubenswrapper[15202]: I0319 09:36:14.718656 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.718633007 podStartE2EDuration="2.718633007s" podCreationTimestamp="2026-03-19 09:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:36:14.717383227 +0000 UTC m=+692.102798043" watchObservedRunningTime="2026-03-19 09:36:14.718633007 +0000 UTC m=+692.104047823" Mar 19 09:36:16.811403 master-0 kubenswrapper[15202]: I0319 09:36:16.811323 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:16.837420 master-0 kubenswrapper[15202]: I0319 09:36:16.837341 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="065e933f-0bbc-4fc1-a13a-196722c83929" Mar 19 09:36:16.837420 master-0 kubenswrapper[15202]: I0319 09:36:16.837400 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="065e933f-0bbc-4fc1-a13a-196722c83929" Mar 19 09:36:16.851821 master-0 kubenswrapper[15202]: I0319 09:36:16.851752 15202 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:16.855682 master-0 kubenswrapper[15202]: I0319 09:36:16.855596 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:36:16.862627 master-0 kubenswrapper[15202]: I0319 09:36:16.862555 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:36:16.869854 master-0 kubenswrapper[15202]: I0319 09:36:16.869793 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:16.874970 master-0 kubenswrapper[15202]: I0319 09:36:16.874932 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 09:36:16.907936 master-0 kubenswrapper[15202]: W0319 09:36:16.907441 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb38ec27e482ba63d3c0762a843676a.slice/crio-200bc25cc80add1c3182ec94b1204359d4aab234bfe157f7db95c4847f0d3a2b WatchSource:0}: Error finding container 200bc25cc80add1c3182ec94b1204359d4aab234bfe157f7db95c4847f0d3a2b: Status 404 returned error can't find the container with id 200bc25cc80add1c3182ec94b1204359d4aab234bfe157f7db95c4847f0d3a2b Mar 19 09:36:17.715711 master-0 kubenswrapper[15202]: I0319 09:36:17.715615 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"053f972a41ed1639a8429e49a93129defdfc3db91a26c567e3e81ceacf9a42e0"} Mar 19 09:36:17.715892 master-0 kubenswrapper[15202]: I0319 09:36:17.715721 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"3eb2ddaa705b087b996222e4c44f06db471868ee4276507978e4fec18e59bf35"} Mar 19 09:36:17.715892 master-0 kubenswrapper[15202]: I0319 09:36:17.715742 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"200bc25cc80add1c3182ec94b1204359d4aab234bfe157f7db95c4847f0d3a2b"} Mar 19 09:36:18.728863 master-0 kubenswrapper[15202]: I0319 09:36:18.728776 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"1f9952e621a4235ca8c23f54a6dd89b8f9fa6308d1b2d074151fed2a0ed38ad0"} Mar 19 09:36:18.729636 master-0 kubenswrapper[15202]: I0319 09:36:18.729610 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"495f626e8c076015bf1f2691df69d5476ac358613f0fe0465c22f5b6b66ef4a1"} Mar 19 09:36:18.755225 master-0 kubenswrapper[15202]: I0319 09:36:18.755142 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.755118304 podStartE2EDuration="2.755118304s" podCreationTimestamp="2026-03-19 09:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:36:18.750912854 +0000 UTC m=+696.136327690" watchObservedRunningTime="2026-03-19 09:36:18.755118304 +0000 UTC m=+696.140533120" Mar 19 09:36:26.871071 master-0 kubenswrapper[15202]: I0319 09:36:26.870949 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:26.871071 master-0 kubenswrapper[15202]: I0319 09:36:26.871082 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:26.871993 master-0 kubenswrapper[15202]: I0319 09:36:26.871119 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:26.871993 master-0 kubenswrapper[15202]: I0319 09:36:26.871155 15202 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:26.877648 master-0 kubenswrapper[15202]: I0319 09:36:26.877564 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:26.879496 master-0 kubenswrapper[15202]: I0319 09:36:26.879438 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:27.822201 master-0 kubenswrapper[15202]: I0319 09:36:27.822104 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:27.823220 master-0 kubenswrapper[15202]: I0319 09:36:27.823182 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:36:44.060788 master-0 kubenswrapper[15202]: I0319 09:36:44.060676 15202 scope.go:117] "RemoveContainer" containerID="aa9ca5f81be4d21b4000c8f0fdd07fdc216eb78f16c7cfa49b4fbe85f9057a8c" Mar 19 09:36:44.084137 master-0 kubenswrapper[15202]: I0319 09:36:44.084077 15202 scope.go:117] "RemoveContainer" containerID="3c9753a11d434b49ac1c7706c0cfb9ad45a06cf5f0dedce5c7137c69786d006a" Mar 19 09:36:44.107744 master-0 kubenswrapper[15202]: I0319 09:36:44.107663 15202 scope.go:117] "RemoveContainer" containerID="104d66d823f67f2d0db81952b3e75346a0594dd7f2e33f5fb4f808501d9d251d" Mar 19 09:36:44.731015 master-0 kubenswrapper[15202]: I0319 09:36:44.730646 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 09:36:44.731327 master-0 kubenswrapper[15202]: E0319 09:36:44.731052 15202 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af31aaf4-3e95-4505-9f5c-af88c1097638" containerName="installer" Mar 19 09:36:44.731327 master-0 kubenswrapper[15202]: I0319 09:36:44.731068 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31aaf4-3e95-4505-9f5c-af88c1097638" containerName="installer" Mar 19 09:36:44.731327 master-0 kubenswrapper[15202]: I0319 09:36:44.731226 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="af31aaf4-3e95-4505-9f5c-af88c1097638" containerName="installer" Mar 19 09:36:44.731787 master-0 kubenswrapper[15202]: I0319 09:36:44.731747 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:36:44.732016 master-0 kubenswrapper[15202]: I0319 09:36:44.731944 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.732145 master-0 kubenswrapper[15202]: I0319 09:36:44.732085 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" containerID="cri-o://a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120" gracePeriod=15 Mar 19 09:36:44.732212 master-0 kubenswrapper[15202]: I0319 09:36:44.732117 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f" gracePeriod=15 Mar 19 09:36:44.732277 master-0 kubenswrapper[15202]: I0319 09:36:44.732190 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381" gracePeriod=15 Mar 19 09:36:44.732277 master-0 kubenswrapper[15202]: I0319 09:36:44.732179 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1" gracePeriod=15 Mar 19 09:36:44.732485 master-0 kubenswrapper[15202]: I0319 09:36:44.732335 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494" gracePeriod=15 Mar 19 09:36:44.735829 master-0 kubenswrapper[15202]: I0319 09:36:44.735785 15202 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 09:36:44.736115 master-0 kubenswrapper[15202]: E0319 09:36:44.736079 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz" Mar 19 09:36:44.736115 master-0 kubenswrapper[15202]: I0319 09:36:44.736097 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz" Mar 19 09:36:44.736115 master-0 kubenswrapper[15202]: E0319 09:36:44.736115 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="setup" Mar 19 09:36:44.736115 master-0 kubenswrapper[15202]: I0319 09:36:44.736123 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="setup" Mar 19 09:36:44.736115 master-0 kubenswrapper[15202]: E0319 
09:36:44.736131 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 09:36:44.736142 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: E0319 09:36:44.736150 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 09:36:44.736190 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: E0319 09:36:44.736207 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 09:36:44.736218 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: E0319 09:36:44.736301 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 09:36:44.736310 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: E0319 09:36:44.736340 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 
09:36:44.736347 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 09:36:44.736555 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" Mar 19 09:36:44.736606 master-0 kubenswrapper[15202]: I0319 09:36:44.736588 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz" Mar 19 09:36:44.737628 master-0 kubenswrapper[15202]: I0319 09:36:44.737508 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" Mar 19 09:36:44.737628 master-0 kubenswrapper[15202]: I0319 09:36:44.737539 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer" Mar 19 09:36:44.737628 master-0 kubenswrapper[15202]: I0319 09:36:44.737553 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:36:44.738719 master-0 kubenswrapper[15202]: I0319 09:36:44.738051 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" Mar 19 09:36:44.848572 master-0 kubenswrapper[15202]: E0319 09:36:44.848438 15202 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.875030 master-0 kubenswrapper[15202]: I0319 09:36:44.874952 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.875348 master-0 kubenswrapper[15202]: I0319 09:36:44.875161 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.875348 master-0 kubenswrapper[15202]: I0319 09:36:44.875198 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.875348 master-0 kubenswrapper[15202]: I0319 09:36:44.875230 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.875348 master-0 kubenswrapper[15202]: I0319 09:36:44.875252 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: 
\"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.875348 master-0 kubenswrapper[15202]: I0319 09:36:44.875319 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.875599 master-0 kubenswrapper[15202]: I0319 09:36:44.875549 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.875645 master-0 kubenswrapper[15202]: I0319 09:36:44.875595 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.968135 master-0 kubenswrapper[15202]: I0319 09:36:44.968073 15202 generic.go:334] "Generic (PLEG): container finished" podID="8f348ddf-ee67-4f81-9c16-b35d1a918669" containerID="e2243249baeedf2fe21ce14cc48c1196cadeee34e720ae9084b1811ac9f8a731" exitCode=0 Mar 19 09:36:44.968401 master-0 kubenswrapper[15202]: I0319 09:36:44.968153 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"8f348ddf-ee67-4f81-9c16-b35d1a918669","Type":"ContainerDied","Data":"e2243249baeedf2fe21ce14cc48c1196cadeee34e720ae9084b1811ac9f8a731"} Mar 19 09:36:44.969126 
master-0 kubenswrapper[15202]: I0319 09:36:44.969094 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:44.970653 master-0 kubenswrapper[15202]: I0319 09:36:44.970620 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-check-endpoints/0.log" Mar 19 09:36:44.971871 master-0 kubenswrapper[15202]: I0319 09:36:44.971847 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 19 09:36:44.972883 master-0 kubenswrapper[15202]: I0319 09:36:44.972702 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f" exitCode=0 Mar 19 09:36:44.972883 master-0 kubenswrapper[15202]: I0319 09:36:44.972721 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381" exitCode=0 Mar 19 09:36:44.972883 master-0 kubenswrapper[15202]: I0319 09:36:44.972732 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1" exitCode=0 Mar 19 09:36:44.972883 master-0 kubenswrapper[15202]: I0319 09:36:44.972744 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494" exitCode=2 Mar 19 09:36:44.972883 
master-0 kubenswrapper[15202]: I0319 09:36:44.972816 15202 scope.go:117] "RemoveContainer" containerID="63d59fc46d669df5a337070a35f5b53eb2b46e5fc1749cd3250ef710c1c9446e" Mar 19 09:36:44.976349 master-0 kubenswrapper[15202]: I0319 09:36:44.976312 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976438 master-0 kubenswrapper[15202]: I0319 09:36:44.976371 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976438 master-0 kubenswrapper[15202]: I0319 09:36:44.976409 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976538 master-0 kubenswrapper[15202]: I0319 09:36:44.976442 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.976538 master-0 kubenswrapper[15202]: I0319 09:36:44.976457 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" 
(UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976538 master-0 kubenswrapper[15202]: I0319 09:36:44.976488 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976622 master-0 kubenswrapper[15202]: I0319 09:36:44.976584 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976705 master-0 kubenswrapper[15202]: I0319 09:36:44.976676 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976753 master-0 kubenswrapper[15202]: I0319 09:36:44.976736 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.976788 master-0 kubenswrapper[15202]: I0319 09:36:44.976771 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976855 master-0 kubenswrapper[15202]: I0319 09:36:44.976834 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976906 master-0 kubenswrapper[15202]: I0319 09:36:44.976860 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:44.976906 master-0 kubenswrapper[15202]: I0319 09:36:44.976901 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.976975 master-0 kubenswrapper[15202]: I0319 09:36:44.976882 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.977009 master-0 
kubenswrapper[15202]: I0319 09:36:44.976971 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:44.977081 master-0 kubenswrapper[15202]: I0319 09:36:44.977062 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:45.149726 master-0 kubenswrapper[15202]: I0319 09:36:45.149630 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:45.170617 master-0 kubenswrapper[15202]: W0319 09:36:45.170558 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbfbf2b56df0323ba118d68bfdad8b9.slice/crio-d4ccb3ecd666cf8303ba54fad090c292ea0032667abf08bc49c16951f74715bc WatchSource:0}: Error finding container d4ccb3ecd666cf8303ba54fad090c292ea0032667abf08bc49c16951f74715bc: Status 404 returned error can't find the container with id d4ccb3ecd666cf8303ba54fad090c292ea0032667abf08bc49c16951f74715bc Mar 19 09:36:45.175183 master-0 kubenswrapper[15202]: E0319 09:36:45.175066 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e347ac7d0cf2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:ebbfbf2b56df0323ba118d68bfdad8b9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:36:45.173559086 +0000 UTC m=+722.558973902,LastTimestamp:2026-03-19 09:36:45.173559086 +0000 UTC m=+722.558973902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 09:36:45.996977 master-0 kubenswrapper[15202]: I0319 09:36:45.996905 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 19 09:36:46.000725 master-0 kubenswrapper[15202]: I0319 09:36:46.000676 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"ebbfbf2b56df0323ba118d68bfdad8b9","Type":"ContainerStarted","Data":"cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8"} Mar 19 09:36:46.000833 master-0 kubenswrapper[15202]: I0319 09:36:46.000747 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"ebbfbf2b56df0323ba118d68bfdad8b9","Type":"ContainerStarted","Data":"d4ccb3ecd666cf8303ba54fad090c292ea0032667abf08bc49c16951f74715bc"} Mar 19 09:36:46.002324 master-0 kubenswrapper[15202]: E0319 09:36:46.002287 15202 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: 
connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 09:36:46.002384 master-0 kubenswrapper[15202]: I0319 09:36:46.002301 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:46.313725 master-0 kubenswrapper[15202]: I0319 09:36:46.313677 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 09:36:46.315016 master-0 kubenswrapper[15202]: I0319 09:36:46.314976 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:46.413564 master-0 kubenswrapper[15202]: I0319 09:36:46.413435 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-var-lock\") pod \"8f348ddf-ee67-4f81-9c16-b35d1a918669\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " Mar 19 09:36:46.414053 master-0 kubenswrapper[15202]: I0319 09:36:46.413602 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-var-lock" (OuterVolumeSpecName: "var-lock") pod "8f348ddf-ee67-4f81-9c16-b35d1a918669" (UID: "8f348ddf-ee67-4f81-9c16-b35d1a918669"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:46.414053 master-0 kubenswrapper[15202]: I0319 09:36:46.413664 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f348ddf-ee67-4f81-9c16-b35d1a918669-kube-api-access\") pod \"8f348ddf-ee67-4f81-9c16-b35d1a918669\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " Mar 19 09:36:46.414053 master-0 kubenswrapper[15202]: I0319 09:36:46.413880 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-kubelet-dir\") pod \"8f348ddf-ee67-4f81-9c16-b35d1a918669\" (UID: \"8f348ddf-ee67-4f81-9c16-b35d1a918669\") " Mar 19 09:36:46.414053 master-0 kubenswrapper[15202]: I0319 09:36:46.413931 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f348ddf-ee67-4f81-9c16-b35d1a918669" (UID: "8f348ddf-ee67-4f81-9c16-b35d1a918669"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:46.414608 master-0 kubenswrapper[15202]: I0319 09:36:46.414574 15202 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:46.414608 master-0 kubenswrapper[15202]: I0319 09:36:46.414608 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f348ddf-ee67-4f81-9c16-b35d1a918669-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:46.416511 master-0 kubenswrapper[15202]: I0319 09:36:46.416443 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f348ddf-ee67-4f81-9c16-b35d1a918669-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f348ddf-ee67-4f81-9c16-b35d1a918669" (UID: "8f348ddf-ee67-4f81-9c16-b35d1a918669"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:46.516512 master-0 kubenswrapper[15202]: I0319 09:36:46.516440 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f348ddf-ee67-4f81-9c16-b35d1a918669-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:47.015116 master-0 kubenswrapper[15202]: I0319 09:36:47.015040 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"8f348ddf-ee67-4f81-9c16-b35d1a918669","Type":"ContainerDied","Data":"b8f8a65533fbb387492d28abc45adacb1d94c03b5766c008e8757cddf005ddc7"} Mar 19 09:36:47.015116 master-0 kubenswrapper[15202]: I0319 09:36:47.015109 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f8a65533fbb387492d28abc45adacb1d94c03b5766c008e8757cddf005ddc7" Mar 19 09:36:47.017084 master-0 kubenswrapper[15202]: I0319 09:36:47.017034 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 09:36:47.022805 master-0 kubenswrapper[15202]: I0319 09:36:47.022744 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:47.226880 master-0 kubenswrapper[15202]: I0319 09:36:47.226810 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 19 09:36:47.229056 master-0 kubenswrapper[15202]: I0319 09:36:47.229016 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:47.230563 master-0 kubenswrapper[15202]: I0319 09:36:47.230453 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:47.231410 master-0 kubenswrapper[15202]: I0319 09:36:47.231322 15202 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:47.345142 master-0 kubenswrapper[15202]: I0319 09:36:47.345031 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 19 09:36:47.345142 master-0 kubenswrapper[15202]: I0319 09:36:47.345159 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 19 09:36:47.345948 master-0 kubenswrapper[15202]: I0319 09:36:47.345178 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:47.345948 master-0 kubenswrapper[15202]: I0319 09:36:47.345286 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:47.345948 master-0 kubenswrapper[15202]: I0319 09:36:47.345368 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 19 09:36:47.345948 master-0 kubenswrapper[15202]: I0319 09:36:47.345518 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:47.346115 master-0 kubenswrapper[15202]: I0319 09:36:47.346057 15202 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:47.346115 master-0 kubenswrapper[15202]: I0319 09:36:47.346086 15202 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:47.346115 master-0 kubenswrapper[15202]: I0319 09:36:47.346108 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 09:36:48.029122 master-0 kubenswrapper[15202]: I0319 09:36:48.029059 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 19 09:36:48.029778 master-0 kubenswrapper[15202]: I0319 09:36:48.029744 15202 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120" exitCode=0 Mar 19 09:36:48.029845 master-0 kubenswrapper[15202]: I0319 09:36:48.029817 15202 scope.go:117] "RemoveContainer" containerID="4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f" Mar 19 09:36:48.029992 master-0 kubenswrapper[15202]: I0319 09:36:48.029958 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 09:36:48.053226 master-0 kubenswrapper[15202]: I0319 09:36:48.053164 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:48.053820 master-0 kubenswrapper[15202]: I0319 09:36:48.053774 15202 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 09:36:48.055171 master-0 kubenswrapper[15202]: I0319 09:36:48.055133 15202 scope.go:117] "RemoveContainer" containerID="272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381" Mar 19 09:36:48.075891 master-0 kubenswrapper[15202]: I0319 09:36:48.075609 15202 scope.go:117] "RemoveContainer" containerID="0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1" Mar 19 09:36:48.097586 master-0 kubenswrapper[15202]: I0319 09:36:48.097499 15202 scope.go:117] "RemoveContainer" containerID="0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494" Mar 19 09:36:48.120896 master-0 kubenswrapper[15202]: I0319 09:36:48.120849 15202 scope.go:117] "RemoveContainer" containerID="a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120" Mar 19 09:36:48.145592 master-0 kubenswrapper[15202]: I0319 09:36:48.145516 15202 scope.go:117] "RemoveContainer" containerID="d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648" Mar 19 09:36:48.165289 master-0 kubenswrapper[15202]: I0319 09:36:48.165225 15202 scope.go:117] "RemoveContainer" 
containerID="4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f" Mar 19 09:36:48.165812 master-0 kubenswrapper[15202]: E0319 09:36:48.165762 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f\": container with ID starting with 4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f not found: ID does not exist" containerID="4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f" Mar 19 09:36:48.165890 master-0 kubenswrapper[15202]: I0319 09:36:48.165853 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f"} err="failed to get container status \"4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f\": rpc error: code = NotFound desc = could not find container \"4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f\": container with ID starting with 4707f01ca0fe1f9a02f5e6098e37c1f244b29d639666833c14ef1ab9e164827f not found: ID does not exist" Mar 19 09:36:48.165940 master-0 kubenswrapper[15202]: I0319 09:36:48.165889 15202 scope.go:117] "RemoveContainer" containerID="272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381" Mar 19 09:36:48.167875 master-0 kubenswrapper[15202]: E0319 09:36:48.167806 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381\": container with ID starting with 272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381 not found: ID does not exist" containerID="272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381" Mar 19 09:36:48.167947 master-0 kubenswrapper[15202]: I0319 09:36:48.167876 15202 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381"} err="failed to get container status \"272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381\": rpc error: code = NotFound desc = could not find container \"272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381\": container with ID starting with 272161fdf03151e08b7ba1c202680f484a23a29bd50c55f926622a4faa2f3381 not found: ID does not exist" Mar 19 09:36:48.167947 master-0 kubenswrapper[15202]: I0319 09:36:48.167919 15202 scope.go:117] "RemoveContainer" containerID="0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1" Mar 19 09:36:48.168482 master-0 kubenswrapper[15202]: E0319 09:36:48.168425 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1\": container with ID starting with 0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1 not found: ID does not exist" containerID="0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1" Mar 19 09:36:48.168551 master-0 kubenswrapper[15202]: I0319 09:36:48.168496 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1"} err="failed to get container status \"0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1\": rpc error: code = NotFound desc = could not find container \"0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1\": container with ID starting with 0be3e95cec94cec1fbcfa06668c3614b1ae497eec865e91fd48fedc10c09e5e1 not found: ID does not exist" Mar 19 09:36:48.168551 master-0 kubenswrapper[15202]: I0319 09:36:48.168532 15202 scope.go:117] "RemoveContainer" containerID="0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494" Mar 19 09:36:48.169012 master-0 kubenswrapper[15202]: E0319 
09:36:48.168974 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494\": container with ID starting with 0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494 not found: ID does not exist" containerID="0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494"
Mar 19 09:36:48.169077 master-0 kubenswrapper[15202]: I0319 09:36:48.169013 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494"} err="failed to get container status \"0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494\": rpc error: code = NotFound desc = could not find container \"0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494\": container with ID starting with 0629d669c3679cdcec8354822020ee36d5841ff61d41d8a785d27446b9f6a494 not found: ID does not exist"
Mar 19 09:36:48.169077 master-0 kubenswrapper[15202]: I0319 09:36:48.169035 15202 scope.go:117] "RemoveContainer" containerID="a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120"
Mar 19 09:36:48.169877 master-0 kubenswrapper[15202]: E0319 09:36:48.169503 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120\": container with ID starting with a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120 not found: ID does not exist" containerID="a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120"
Mar 19 09:36:48.169877 master-0 kubenswrapper[15202]: I0319 09:36:48.169540 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120"} err="failed to get container status \"a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120\": rpc error: code = NotFound desc = could not find container \"a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120\": container with ID starting with a8159bc00876619b7d3e41824d96ca925b532a155a7cebe504d40786f0abb120 not found: ID does not exist"
Mar 19 09:36:48.169877 master-0 kubenswrapper[15202]: I0319 09:36:48.169565 15202 scope.go:117] "RemoveContainer" containerID="d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648"
Mar 19 09:36:48.170405 master-0 kubenswrapper[15202]: E0319 09:36:48.170233 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648\": container with ID starting with d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648 not found: ID does not exist" containerID="d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648"
Mar 19 09:36:48.170405 master-0 kubenswrapper[15202]: I0319 09:36:48.170294 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648"} err="failed to get container status \"d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648\": rpc error: code = NotFound desc = could not find container \"d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648\": container with ID starting with d6a3cc5d39b44803b27cd70c29a33859bda5115b0664e8e5c26b028d68a0f648 not found: ID does not exist"
Mar 19 09:36:48.827449 master-0 kubenswrapper[15202]: I0319 09:36:48.827374 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5ce05b3d592e63f1f92202d52b9635" path="/var/lib/kubelet/pods/7d5ce05b3d592e63f1f92202d52b9635/volumes"
Mar 19 09:36:49.638584 master-0 kubenswrapper[15202]: E0319 09:36:49.637184 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e347ac7d0cf2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:ebbfbf2b56df0323ba118d68bfdad8b9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:36:45.173559086 +0000 UTC m=+722.558973902,LastTimestamp:2026-03-19 09:36:45.173559086 +0000 UTC m=+722.558973902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:36:50.798017 master-0 kubenswrapper[15202]: E0319 09:36:50.797927 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:50.798762 master-0 kubenswrapper[15202]: E0319 09:36:50.798688 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:50.799641 master-0 kubenswrapper[15202]: E0319 09:36:50.799561 15202 controller.go:195] "Failed to update lease" err="Put
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:50.800394 master-0 kubenswrapper[15202]: E0319 09:36:50.800266 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:50.801113 master-0 kubenswrapper[15202]: E0319 09:36:50.801075 15202 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:50.801259 master-0 kubenswrapper[15202]: I0319 09:36:50.801130 15202 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 09:36:50.801772 master-0 kubenswrapper[15202]: E0319 09:36:50.801671 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 09:36:51.003381 master-0 kubenswrapper[15202]: E0319 09:36:51.003301 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 09:36:51.405707 master-0 kubenswrapper[15202]: E0319 09:36:51.405526 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 19 09:36:52.207960 master-0 kubenswrapper[15202]: E0319 09:36:52.207798 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 09:36:52.817655 master-0 kubenswrapper[15202]: I0319 09:36:52.817560 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:53.809445 master-0 kubenswrapper[15202]: E0319 09:36:53.809349 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 19 09:36:57.011576 master-0 kubenswrapper[15202]: E0319 09:36:57.011414 15202 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 19 09:36:58.118688 master-0 kubenswrapper[15202]: I0319 09:36:58.118622 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_beb38ec27e482ba63d3c0762a843676a/kube-controller-manager/0.log"
Mar 19 09:36:58.119790 master-0 kubenswrapper[15202]: I0319 09:36:58.119708 15202 generic.go:334] "Generic (PLEG): container finished" podID="beb38ec27e482ba63d3c0762a843676a" containerID="3eb2ddaa705b087b996222e4c44f06db471868ee4276507978e4fec18e59bf35" exitCode=1
Mar 19 09:36:58.119907 master-0 kubenswrapper[15202]: I0319 09:36:58.119806 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerDied","Data":"3eb2ddaa705b087b996222e4c44f06db471868ee4276507978e4fec18e59bf35"}
Mar 19 09:36:58.122705 master-0 kubenswrapper[15202]: I0319 09:36:58.121982 15202 scope.go:117] "RemoveContainer" containerID="3eb2ddaa705b087b996222e4c44f06db471868ee4276507978e4fec18e59bf35"
Mar 19 09:36:58.123573 master-0 kubenswrapper[15202]: I0319 09:36:58.122821 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:58.124574 master-0 kubenswrapper[15202]: I0319 09:36:58.124454 15202 status_manager.go:851] "Failed to get status for pod" podUID="beb38ec27e482ba63d3c0762a843676a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:58.811694 master-0 kubenswrapper[15202]: I0319 09:36:58.811599 15202 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:36:58.820733 master-0 kubenswrapper[15202]: I0319 09:36:58.817179 15202 status_manager.go:851] "Failed to get status for pod" podUID="beb38ec27e482ba63d3c0762a843676a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:58.822902 master-0 kubenswrapper[15202]: I0319 09:36:58.822082 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:58.846143 master-0 kubenswrapper[15202]: I0319 09:36:58.846040 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:36:58.846143 master-0 kubenswrapper[15202]: I0319 09:36:58.846127 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:36:58.847308 master-0 kubenswrapper[15202]: E0319 09:36:58.847239 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:36:58.853167 master-0 kubenswrapper[15202]: I0319 09:36:58.852812 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:36:58.884288 master-0 kubenswrapper[15202]: W0319 09:36:58.884219 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274c4bebf95a655851b2cf276fe43ef7.slice/crio-d997863069e88c5d301e2cd98119ae1c87e0f427eb6d31fc05265aa11bdfeb42 WatchSource:0}: Error finding container d997863069e88c5d301e2cd98119ae1c87e0f427eb6d31fc05265aa11bdfeb42: Status 404 returned error can't find the container with id d997863069e88c5d301e2cd98119ae1c87e0f427eb6d31fc05265aa11bdfeb42
Mar 19 09:36:59.132633 master-0 kubenswrapper[15202]: I0319 09:36:59.132580 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_beb38ec27e482ba63d3c0762a843676a/kube-controller-manager/0.log"
Mar 19 09:36:59.134858 master-0 kubenswrapper[15202]: I0319 09:36:59.132767 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"183fcd42c45db1c3563f11bb2c7bfb861eb30e9db72cbc705f94299e820d7898"}
Mar 19 09:36:59.134858 master-0 kubenswrapper[15202]: I0319 09:36:59.134369 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:59.134858 master-0 kubenswrapper[15202]: I0319 09:36:59.134559 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"d997863069e88c5d301e2cd98119ae1c87e0f427eb6d31fc05265aa11bdfeb42"}
Mar 19 09:36:59.135892 master-0 kubenswrapper[15202]: I0319 09:36:59.135832 15202 status_manager.go:851] "Failed to get status for pod" podUID="beb38ec27e482ba63d3c0762a843676a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:36:59.640650 master-0 kubenswrapper[15202]: E0319 09:36:59.640327 15202 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e347ac7d0cf2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:ebbfbf2b56df0323ba118d68bfdad8b9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 09:36:45.173559086 +0000 UTC m=+722.558973902,LastTimestamp:2026-03-19 09:36:45.173559086 +0000 UTC m=+722.558973902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 09:37:00.148521 master-0 kubenswrapper[15202]: I0319 09:37:00.148235 15202 generic.go:334] "Generic (PLEG): container finished"
podID="274c4bebf95a655851b2cf276fe43ef7" containerID="a531dbba5ef958f61968e5a7d57e1d6bb83d80f6985cf21326d2998412c22393" exitCode=0
Mar 19 09:37:00.149620 master-0 kubenswrapper[15202]: I0319 09:37:00.148546 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerDied","Data":"a531dbba5ef958f61968e5a7d57e1d6bb83d80f6985cf21326d2998412c22393"}
Mar 19 09:37:00.150397 master-0 kubenswrapper[15202]: I0319 09:37:00.150336 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:00.150659 master-0 kubenswrapper[15202]: I0319 09:37:00.150627 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:00.151730 master-0 kubenswrapper[15202]: I0319 09:37:00.151662 15202 status_manager.go:851] "Failed to get status for pod" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:37:00.152031 master-0 kubenswrapper[15202]: E0319 09:37:00.151960 15202 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:00.152528 master-0 kubenswrapper[15202]: I0319 09:37:00.152419 15202 status_manager.go:851] "Failed to get status for pod" podUID="beb38ec27e482ba63d3c0762a843676a" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 09:37:01.164400 master-0 kubenswrapper[15202]: I0319 09:37:01.164274 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"4260432a4b18e068a083e08336f7185c72de3274580ea2779a467a2176566f70"}
Mar 19 09:37:01.164400 master-0 kubenswrapper[15202]: I0319 09:37:01.164341 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"cac05d0f0455c10474ba08f296fcb3a7e92ae437cff75a5a560cff0d46c43342"}
Mar 19 09:37:01.164400 master-0 kubenswrapper[15202]: I0319 09:37:01.164354 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"0b525ca042a3a8494f293de5fcc5012055226efbfa9bbffaf16a4958cfe9d336"}
Mar 19 09:37:02.175200 master-0 kubenswrapper[15202]: I0319 09:37:02.175148 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"66923e4748d2dad6ee0b3afe85f1841e8354d5d5d382a1641b21abd97a174046"}
Mar 19 09:37:02.175200 master-0 kubenswrapper[15202]: I0319 09:37:02.175201 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"9ed6f4ea6f7a2a454d5812317c20c3922c80995b5caebb5b8ceaa512b6d67873"}
Mar 19 09:37:02.175901 master-0 kubenswrapper[15202]: I0319 09:37:02.175641 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:02.175901 master-0 kubenswrapper[15202]: I0319 09:37:02.175657 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:02.175901 master-0 kubenswrapper[15202]: I0319 09:37:02.175880 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:02.992852 master-0 kubenswrapper[15202]: I0319 09:37:02.992778 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 09:37:03.854029 master-0 kubenswrapper[15202]: I0319 09:37:03.853956 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:03.854029 master-0 kubenswrapper[15202]: I0319 09:37:03.854016 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:03.860156 master-0 kubenswrapper[15202]: I0319 09:37:03.860088 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:06.871158 master-0 kubenswrapper[15202]: I0319 09:37:06.871064 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:37:06.876623 master-0 kubenswrapper[15202]: I0319 09:37:06.871773 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:37:06.876623 master-0 kubenswrapper[15202]: I0319 09:37:06.871890 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 09:37:06.876623 master-0 kubenswrapper[15202]: I0319 09:37:06.872006 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="beb38ec27e482ba63d3c0762a843676a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 09:37:07.232388 master-0 kubenswrapper[15202]: I0319 09:37:07.232332 15202 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:07.583962 master-0 kubenswrapper[15202]: I0319 09:37:07.583749 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="4478155f-e1c8-42e1-8cac-2b6d6c16c81c"
Mar 19 09:37:08.223896 master-0 kubenswrapper[15202]: I0319 09:37:08.223829 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:08.223896 master-0 kubenswrapper[15202]: I0319 09:37:08.223869 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:08.229086 master-0 kubenswrapper[15202]: I0319 09:37:08.227735 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:09.230891 master-0 kubenswrapper[15202]: I0319 09:37:09.230810 15202 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:09.231560 master-0 kubenswrapper[15202]: I0319 09:37:09.231540 15202 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="516bf51f-c7e4-4837-9994-41e603754099"
Mar 19 09:37:12.829776 master-0 kubenswrapper[15202]: I0319 09:37:12.829670 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="4478155f-e1c8-42e1-8cac-2b6d6c16c81c"
Mar 19 09:37:16.745944 master-0 kubenswrapper[15202]: I0319 09:37:16.745850 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 09:37:16.759271 master-0 kubenswrapper[15202]: I0319 09:37:16.759217 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 09:37:16.870489 master-0 kubenswrapper[15202]: I0319 09:37:16.870360 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 09:37:16.870489 master-0 kubenswrapper[15202]: I0319 09:37:16.870429 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="beb38ec27e482ba63d3c0762a843676a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 09:37:17.397373 master-0 kubenswrapper[15202]: I0319 09:37:17.397248 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-r6z7f"
Mar 19 09:37:17.552064 master-0 kubenswrapper[15202]: I0319 09:37:17.551986 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 09:37:17.625397 master-0 kubenswrapper[15202]: I0319 09:37:17.624753 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 09:37:17.728348 master-0 kubenswrapper[15202]: I0319 09:37:17.728290 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 09:37:17.842031 master-0 kubenswrapper[15202]: I0319 09:37:17.841943 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 09:37:17.863380 master-0 kubenswrapper[15202]: I0319 09:37:17.863272 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:37:17.962340 master-0 kubenswrapper[15202]: I0319 09:37:17.962200 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 09:37:18.039211 master-0 kubenswrapper[15202]: I0319 09:37:18.039033 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 19 09:37:18.041681 master-0 kubenswrapper[15202]: I0319 09:37:18.041644 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 09:37:18.110283 master-0 kubenswrapper[15202]: I0319 09:37:18.110190 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 09:37:18.161247 master-0 kubenswrapper[15202]: I0319 09:37:18.161168 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19
09:37:18.205461 master-0 kubenswrapper[15202]: I0319 09:37:18.205405 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 09:37:18.337763 master-0 kubenswrapper[15202]: I0319 09:37:18.337574 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 19 09:37:18.434931 master-0 kubenswrapper[15202]: I0319 09:37:18.434823 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 09:37:18.625308 master-0 kubenswrapper[15202]: I0319 09:37:18.625103 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:37:18.674858 master-0 kubenswrapper[15202]: I0319 09:37:18.674773 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 09:37:18.725980 master-0 kubenswrapper[15202]: I0319 09:37:18.725885 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 09:37:18.809038 master-0 kubenswrapper[15202]: I0319 09:37:18.808936 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 19 09:37:18.935425 master-0 kubenswrapper[15202]: I0319 09:37:18.935338 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 09:37:18.947641 master-0 kubenswrapper[15202]: I0319 09:37:18.947345 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 09:37:19.246619 master-0 kubenswrapper[15202]: I0319 09:37:19.246406 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 09:37:19.447749 master-0 kubenswrapper[15202]: I0319 09:37:19.447654 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 09:37:19.522375 master-0 kubenswrapper[15202]: I0319 09:37:19.522155 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 09:37:19.531441 master-0 kubenswrapper[15202]: I0319 09:37:19.531374 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 09:37:19.602184 master-0 kubenswrapper[15202]: I0319 09:37:19.602094 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-l9t78"
Mar 19 09:37:19.823661 master-0 kubenswrapper[15202]: I0319 09:37:19.823497 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 09:37:19.959995 master-0 kubenswrapper[15202]: I0319 09:37:19.959921 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 09:37:20.049100 master-0 kubenswrapper[15202]: I0319 09:37:20.049002 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 09:37:20.073258 master-0 kubenswrapper[15202]: I0319 09:37:20.073200 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:37:20.087717 master-0 kubenswrapper[15202]: I0319 09:37:20.087564 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 09:37:20.107310 master-0 kubenswrapper[15202]: I0319 09:37:20.107159 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 09:37:20.171331 master-0 kubenswrapper[15202]: I0319 09:37:20.171290 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 09:37:20.272773 master-0 kubenswrapper[15202]: I0319 09:37:20.272687 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 09:37:20.307738 master-0 kubenswrapper[15202]: I0319 09:37:20.307675 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 19 09:37:20.378812 master-0 kubenswrapper[15202]: I0319 09:37:20.378643 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 09:37:20.397371 master-0 kubenswrapper[15202]: I0319 09:37:20.397313 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 09:37:20.419745 master-0 kubenswrapper[15202]: I0319 09:37:20.419685 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 09:37:20.434737 master-0 kubenswrapper[15202]: I0319 09:37:20.434699 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 19 09:37:20.452678 master-0 kubenswrapper[15202]: I0319 09:37:20.452641 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:37:20.494838 master-0 kubenswrapper[15202]: I0319 09:37:20.494779 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:37:20.776980 master-0 kubenswrapper[15202]: I0319 09:37:20.776894 15202 reflector.go:368]
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 09:37:20.803335 master-0 kubenswrapper[15202]: I0319 09:37:20.803264 15202 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 09:37:20.809893 master-0 kubenswrapper[15202]: I0319 09:37:20.809833 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:37:20.809893 master-0 kubenswrapper[15202]: I0319 09:37:20.809893 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 09:37:20.822932 master-0 kubenswrapper[15202]: I0319 09:37:20.822836 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 09:37:20.832269 master-0 kubenswrapper[15202]: I0319 09:37:20.832152 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=13.832134209 podStartE2EDuration="13.832134209s" podCreationTimestamp="2026-03-19 09:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:37:20.830899529 +0000 UTC m=+758.216314345" watchObservedRunningTime="2026-03-19 09:37:20.832134209 +0000 UTC m=+758.217549025"
Mar 19 09:37:20.906908 master-0 kubenswrapper[15202]: I0319 09:37:20.906863 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 09:37:20.923517 master-0 kubenswrapper[15202]: I0319 09:37:20.923442 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-bgq5z"
Mar 19 09:37:20.934689 master-0 kubenswrapper[15202]: I0319 09:37:20.934640 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 09:37:20.940394 master-0 kubenswrapper[15202]: I0319 09:37:20.940333 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 09:37:20.974600 master-0 kubenswrapper[15202]: I0319 09:37:20.974529 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 09:37:21.212137 master-0 kubenswrapper[15202]: I0319 09:37:21.212043 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 09:37:21.219442 master-0 kubenswrapper[15202]: I0319 09:37:21.219391 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 09:37:21.225501 master-0 kubenswrapper[15202]: I0319 09:37:21.225443 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 09:37:21.250380 master-0 kubenswrapper[15202]: I0319 09:37:21.250321 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 09:37:21.257022 master-0 kubenswrapper[15202]: I0319 09:37:21.256993 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 09:37:21.374297 master-0 kubenswrapper[15202]: I0319 09:37:21.374247 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 09:37:21.410955 master-0 kubenswrapper[15202]: I0319 09:37:21.410888 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 19 09:37:21.454095 master-0 kubenswrapper[15202]: I0319 09:37:21.454019 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 09:37:21.484782 master-0 kubenswrapper[15202]: I0319 09:37:21.484685 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 19 09:37:21.501400 master-0 kubenswrapper[15202]: I0319 09:37:21.501357 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 09:37:21.501852 master-0 kubenswrapper[15202]: I0319 09:37:21.501820 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 09:37:21.615125 master-0 kubenswrapper[15202]: I0319 09:37:21.615054 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:37:21.623852 master-0 kubenswrapper[15202]: I0319 09:37:21.623813 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 09:37:21.644440 master-0 kubenswrapper[15202]: I0319 09:37:21.644397 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 09:37:21.760691 master-0 kubenswrapper[15202]: I0319 09:37:21.760586 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 09:37:21.829345 master-0 kubenswrapper[15202]: I0319 09:37:21.829293 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 09:37:21.829918 master-0 kubenswrapper[15202]: I0319 09:37:21.829846 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 09:37:21.866722 master-0 kubenswrapper[15202]: I0319 09:37:21.866651 15202 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 09:37:21.958903 master-0 kubenswrapper[15202]: I0319 09:37:21.958833 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:37:22.032634 master-0 kubenswrapper[15202]: I0319 09:37:22.032507 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 09:37:22.123816 master-0 kubenswrapper[15202]: I0319 09:37:22.123745 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:37:22.178651 master-0 kubenswrapper[15202]: I0319 09:37:22.178585 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 09:37:22.201984 master-0 kubenswrapper[15202]: I0319 09:37:22.201925 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:37:22.219481 master-0 kubenswrapper[15202]: I0319 09:37:22.219419 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-t6dfg" Mar 19 09:37:22.227434 master-0 kubenswrapper[15202]: I0319 09:37:22.227381 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:37:22.307567 master-0 kubenswrapper[15202]: I0319 09:37:22.307442 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-5339u6k6jn3h3" Mar 19 09:37:22.355093 master-0 kubenswrapper[15202]: I0319 09:37:22.355044 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 09:37:22.424055 master-0 kubenswrapper[15202]: I0319 09:37:22.424002 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" 
Mar 19 09:37:22.465717 master-0 kubenswrapper[15202]: I0319 09:37:22.465656 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:37:22.552441 master-0 kubenswrapper[15202]: I0319 09:37:22.552383 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:37:22.575042 master-0 kubenswrapper[15202]: I0319 09:37:22.574908 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-j66zv" Mar 19 09:37:22.576866 master-0 kubenswrapper[15202]: I0319 09:37:22.576807 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:37:22.592582 master-0 kubenswrapper[15202]: I0319 09:37:22.592529 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:37:22.686030 master-0 kubenswrapper[15202]: I0319 09:37:22.685949 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 09:37:22.714237 master-0 kubenswrapper[15202]: I0319 09:37:22.714173 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:37:22.714542 master-0 kubenswrapper[15202]: I0319 09:37:22.714244 15202 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:37:22.727551 master-0 kubenswrapper[15202]: I0319 09:37:22.727459 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:37:22.784745 master-0 kubenswrapper[15202]: I0319 09:37:22.784700 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 09:37:22.800620 master-0 kubenswrapper[15202]: I0319 09:37:22.800539 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 09:37:22.832163 master-0 kubenswrapper[15202]: I0319 09:37:22.832041 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wkbj2" Mar 19 09:37:22.958617 master-0 kubenswrapper[15202]: I0319 09:37:22.958453 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 09:37:23.099910 master-0 kubenswrapper[15202]: I0319 09:37:23.099697 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-j956x" Mar 19 09:37:23.249809 master-0 kubenswrapper[15202]: I0319 09:37:23.249722 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 09:37:23.315116 master-0 kubenswrapper[15202]: I0319 09:37:23.315059 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 09:37:23.340629 master-0 kubenswrapper[15202]: I0319 09:37:23.340554 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 09:37:23.475055 master-0 kubenswrapper[15202]: I0319 09:37:23.474975 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 09:37:23.493783 master-0 kubenswrapper[15202]: I0319 09:37:23.493696 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:37:23.531336 master-0 kubenswrapper[15202]: I0319 09:37:23.531274 15202 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 09:37:23.553809 master-0 kubenswrapper[15202]: I0319 09:37:23.553741 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 09:37:23.576241 master-0 kubenswrapper[15202]: I0319 09:37:23.576175 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:37:23.583297 master-0 kubenswrapper[15202]: I0319 09:37:23.583241 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:37:23.608652 master-0 kubenswrapper[15202]: I0319 09:37:23.608589 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:37:23.628917 master-0 kubenswrapper[15202]: I0319 09:37:23.628872 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 09:37:23.650626 master-0 kubenswrapper[15202]: I0319 09:37:23.650557 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:37:23.720775 master-0 kubenswrapper[15202]: I0319 09:37:23.720735 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 09:37:23.750392 master-0 kubenswrapper[15202]: I0319 09:37:23.748175 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 19 09:37:23.759328 master-0 kubenswrapper[15202]: I0319 09:37:23.759269 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:37:23.759721 master-0 kubenswrapper[15202]: I0319 09:37:23.759692 15202 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Mar 19 09:37:23.763336 master-0 kubenswrapper[15202]: I0319 09:37:23.763273 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:37:23.763753 master-0 kubenswrapper[15202]: I0319 09:37:23.763553 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 09:37:23.787893 master-0 kubenswrapper[15202]: I0319 09:37:23.787833 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:37:23.820917 master-0 kubenswrapper[15202]: I0319 09:37:23.820843 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-mdr74" Mar 19 09:37:23.834510 master-0 kubenswrapper[15202]: I0319 09:37:23.834435 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-bvdqs" Mar 19 09:37:23.840576 master-0 kubenswrapper[15202]: I0319 09:37:23.840535 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:37:23.844150 master-0 kubenswrapper[15202]: I0319 09:37:23.844102 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:37:23.856104 master-0 kubenswrapper[15202]: I0319 09:37:23.856058 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 09:37:23.894234 master-0 kubenswrapper[15202]: I0319 09:37:23.894144 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 09:37:23.920859 master-0 kubenswrapper[15202]: I0319 09:37:23.920785 
15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 09:37:24.048647 master-0 kubenswrapper[15202]: I0319 09:37:24.048506 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 09:37:24.247288 master-0 kubenswrapper[15202]: I0319 09:37:24.247233 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 09:37:24.247288 master-0 kubenswrapper[15202]: I0319 09:37:24.247247 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:37:24.276744 master-0 kubenswrapper[15202]: I0319 09:37:24.276672 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 09:37:24.283648 master-0 kubenswrapper[15202]: I0319 09:37:24.283594 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:37:24.306939 master-0 kubenswrapper[15202]: I0319 09:37:24.306738 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:37:24.468857 master-0 kubenswrapper[15202]: I0319 09:37:24.468785 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:37:24.500686 master-0 kubenswrapper[15202]: I0319 09:37:24.500623 15202 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:37:24.501866 master-0 kubenswrapper[15202]: I0319 09:37:24.501826 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:37:24.539116 master-0 kubenswrapper[15202]: I0319 09:37:24.539057 15202 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 09:37:24.543714 master-0 kubenswrapper[15202]: I0319 09:37:24.543672 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:37:24.545501 master-0 kubenswrapper[15202]: I0319 09:37:24.545454 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:37:24.578536 master-0 kubenswrapper[15202]: I0319 09:37:24.578400 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 09:37:24.590588 master-0 kubenswrapper[15202]: I0319 09:37:24.590515 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 09:37:24.613079 master-0 kubenswrapper[15202]: I0319 09:37:24.613036 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:37:24.661435 master-0 kubenswrapper[15202]: I0319 09:37:24.661364 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:37:24.708430 master-0 kubenswrapper[15202]: I0319 09:37:24.708347 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 09:37:24.793623 master-0 kubenswrapper[15202]: I0319 09:37:24.793530 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 09:37:24.924945 master-0 kubenswrapper[15202]: I0319 09:37:24.924789 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 09:37:24.950425 master-0 kubenswrapper[15202]: I0319 09:37:24.950362 15202 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 09:37:25.111021 master-0 kubenswrapper[15202]: I0319 09:37:25.110938 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 09:37:25.207751 master-0 kubenswrapper[15202]: I0319 09:37:25.207664 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:37:25.222046 master-0 kubenswrapper[15202]: I0319 09:37:25.221969 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-4i3vpe46p0rrq" Mar 19 09:37:25.233300 master-0 kubenswrapper[15202]: I0319 09:37:25.233240 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 09:37:25.249084 master-0 kubenswrapper[15202]: I0319 09:37:25.248947 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:37:25.305847 master-0 kubenswrapper[15202]: I0319 09:37:25.305749 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-6s584" Mar 19 09:37:25.343566 master-0 kubenswrapper[15202]: I0319 09:37:25.343237 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 09:37:25.402138 master-0 kubenswrapper[15202]: I0319 09:37:25.402090 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:37:25.432414 master-0 kubenswrapper[15202]: I0319 09:37:25.432370 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:37:25.505714 
master-0 kubenswrapper[15202]: I0319 09:37:25.505595 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:37:25.532779 master-0 kubenswrapper[15202]: I0319 09:37:25.532712 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 09:37:25.546615 master-0 kubenswrapper[15202]: I0319 09:37:25.546450 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:37:25.553701 master-0 kubenswrapper[15202]: I0319 09:37:25.553663 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:37:25.575367 master-0 kubenswrapper[15202]: I0319 09:37:25.575324 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 09:37:25.622798 master-0 kubenswrapper[15202]: I0319 09:37:25.622731 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 09:37:25.639889 master-0 kubenswrapper[15202]: I0319 09:37:25.639801 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 09:37:25.675914 master-0 kubenswrapper[15202]: I0319 09:37:25.675787 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:37:25.677161 master-0 kubenswrapper[15202]: I0319 09:37:25.677075 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 09:37:25.718291 master-0 kubenswrapper[15202]: I0319 09:37:25.717996 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 
19 09:37:25.730161 master-0 kubenswrapper[15202]: I0319 09:37:25.730114 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:37:25.831863 master-0 kubenswrapper[15202]: I0319 09:37:25.831672 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 09:37:25.861445 master-0 kubenswrapper[15202]: I0319 09:37:25.861375 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:37:25.897857 master-0 kubenswrapper[15202]: I0319 09:37:25.897798 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 09:37:25.912429 master-0 kubenswrapper[15202]: I0319 09:37:25.912353 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 09:37:26.021452 master-0 kubenswrapper[15202]: I0319 09:37:26.021377 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 09:37:26.068715 master-0 kubenswrapper[15202]: I0319 09:37:26.068646 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 09:37:26.198114 master-0 kubenswrapper[15202]: I0319 09:37:26.198070 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:37:26.224637 master-0 kubenswrapper[15202]: I0319 09:37:26.224555 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:37:26.225039 master-0 kubenswrapper[15202]: I0319 09:37:26.225009 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-llwk7" Mar 19 
09:37:26.256148 master-0 kubenswrapper[15202]: I0319 09:37:26.256092 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:37:26.271595 master-0 kubenswrapper[15202]: I0319 09:37:26.271545 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:37:26.290076 master-0 kubenswrapper[15202]: I0319 09:37:26.290020 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 09:37:26.335216 master-0 kubenswrapper[15202]: I0319 09:37:26.335158 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:37:26.377357 master-0 kubenswrapper[15202]: I0319 09:37:26.377299 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 09:37:26.425586 master-0 kubenswrapper[15202]: I0319 09:37:26.425542 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 09:37:26.476096 master-0 kubenswrapper[15202]: I0319 09:37:26.475974 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 09:37:26.482543 master-0 kubenswrapper[15202]: I0319 09:37:26.482511 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 09:37:26.526998 master-0 kubenswrapper[15202]: I0319 09:37:26.526949 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 09:37:26.681578 master-0 kubenswrapper[15202]: I0319 09:37:26.681531 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:37:26.755283 master-0 
kubenswrapper[15202]: I0319 09:37:26.755158 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:37:26.787635 master-0 kubenswrapper[15202]: I0319 09:37:26.787591 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:37:26.870989 master-0 kubenswrapper[15202]: I0319 09:37:26.870927 15202 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 19 09:37:26.871333 master-0 kubenswrapper[15202]: I0319 09:37:26.871296 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="beb38ec27e482ba63d3c0762a843676a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 19 09:37:26.871461 master-0 kubenswrapper[15202]: I0319 09:37:26.871446 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 09:37:26.872378 master-0 kubenswrapper[15202]: I0319 09:37:26.872352 15202 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"183fcd42c45db1c3563f11bb2c7bfb861eb30e9db72cbc705f94299e820d7898"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 19 09:37:26.872624 master-0 kubenswrapper[15202]: I0319 09:37:26.872599 15202 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="beb38ec27e482ba63d3c0762a843676a" containerName="kube-controller-manager" containerID="cri-o://183fcd42c45db1c3563f11bb2c7bfb861eb30e9db72cbc705f94299e820d7898" gracePeriod=30 Mar 19 09:37:26.899045 master-0 kubenswrapper[15202]: I0319 09:37:26.898995 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 09:37:26.899898 master-0 kubenswrapper[15202]: I0319 09:37:26.899875 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:37:26.901904 master-0 kubenswrapper[15202]: I0319 09:37:26.901867 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:37:26.918173 master-0 kubenswrapper[15202]: I0319 09:37:26.918107 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:37:26.971644 master-0 kubenswrapper[15202]: I0319 09:37:26.971593 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:37:26.984037 master-0 kubenswrapper[15202]: I0319 09:37:26.983977 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:37:27.005395 master-0 kubenswrapper[15202]: I0319 09:37:27.005270 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 09:37:27.074529 master-0 kubenswrapper[15202]: I0319 09:37:27.074451 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 09:37:27.201145 master-0 kubenswrapper[15202]: I0319 
09:37:27.201072 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-17lh7pj6890g7" Mar 19 09:37:27.448973 master-0 kubenswrapper[15202]: I0319 09:37:27.448900 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 09:37:27.491520 master-0 kubenswrapper[15202]: I0319 09:37:27.491409 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:37:27.493386 master-0 kubenswrapper[15202]: I0319 09:37:27.493349 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 09:37:27.555231 master-0 kubenswrapper[15202]: I0319 09:37:27.555051 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:37:27.636304 master-0 kubenswrapper[15202]: I0319 09:37:27.636242 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:37:27.640647 master-0 kubenswrapper[15202]: I0319 09:37:27.640450 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:37:27.695616 master-0 kubenswrapper[15202]: I0319 09:37:27.695526 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:37:27.716373 master-0 kubenswrapper[15202]: I0319 09:37:27.716242 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:37:27.747802 master-0 kubenswrapper[15202]: I0319 09:37:27.747744 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-jtdpn" Mar 19 09:37:27.754794 master-0 
kubenswrapper[15202]: I0319 09:37:27.754745 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:37:27.760126 master-0 kubenswrapper[15202]: I0319 09:37:27.760054 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 09:37:27.778592 master-0 kubenswrapper[15202]: I0319 09:37:27.778506 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:37:27.800745 master-0 kubenswrapper[15202]: I0319 09:37:27.800667 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 09:37:27.802287 master-0 kubenswrapper[15202]: I0319 09:37:27.802230 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 19 09:37:27.848197 master-0 kubenswrapper[15202]: I0319 09:37:27.848115 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 19 09:37:27.990782 master-0 kubenswrapper[15202]: I0319 09:37:27.990644 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 09:37:28.007330 master-0 kubenswrapper[15202]: I0319 09:37:28.007278 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xcbjl"
Mar 19 09:37:28.033341 master-0 kubenswrapper[15202]: I0319 09:37:28.033251 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:37:28.094046 master-0 kubenswrapper[15202]: I0319 09:37:28.093926 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 09:37:28.215250 master-0 kubenswrapper[15202]: I0319 09:37:28.215168 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:37:28.223794 master-0 kubenswrapper[15202]: I0319 09:37:28.223734 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 09:37:28.228667 master-0 kubenswrapper[15202]: I0319 09:37:28.228607 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 09:37:28.240417 master-0 kubenswrapper[15202]: I0319 09:37:28.240340 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 09:37:28.294880 master-0 kubenswrapper[15202]: I0319 09:37:28.294734 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 09:37:28.295122 master-0 kubenswrapper[15202]: I0319 09:37:28.295030 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 09:37:28.321548 master-0 kubenswrapper[15202]: I0319 09:37:28.321499 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 09:37:28.424448 master-0 kubenswrapper[15202]: I0319 09:37:28.424367 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 09:37:28.487727 master-0 kubenswrapper[15202]: I0319 09:37:28.487642 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 19 09:37:28.531002 master-0 kubenswrapper[15202]: I0319 09:37:28.530953 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 09:37:28.588814 master-0 kubenswrapper[15202]: I0319 09:37:28.588644 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 09:37:28.601932 master-0 kubenswrapper[15202]: I0319 09:37:28.601876 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 09:37:28.604439 master-0 kubenswrapper[15202]: I0319 09:37:28.604406 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zxmm6"
Mar 19 09:37:28.616256 master-0 kubenswrapper[15202]: I0319 09:37:28.616200 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:37:28.722328 master-0 kubenswrapper[15202]: I0319 09:37:28.722246 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 09:37:28.797019 master-0 kubenswrapper[15202]: I0319 09:37:28.796933 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 09:37:28.940051 master-0 kubenswrapper[15202]: I0319 09:37:28.939983 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 19 09:37:28.976206 master-0 kubenswrapper[15202]: I0319 09:37:28.976129 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 09:37:29.035850 master-0 kubenswrapper[15202]: I0319 09:37:29.035758 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xvfqf"
Mar 19 09:37:29.120536 master-0 kubenswrapper[15202]: I0319 09:37:29.120395 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 09:37:29.136253 master-0 kubenswrapper[15202]: I0319 09:37:29.136185 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 09:37:29.139424 master-0 kubenswrapper[15202]: I0319 09:37:29.139389 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 09:37:29.146810 master-0 kubenswrapper[15202]: I0319 09:37:29.146771 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 09:37:29.234337 master-0 kubenswrapper[15202]: I0319 09:37:29.234184 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 09:37:29.382862 master-0 kubenswrapper[15202]: I0319 09:37:29.382801 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 19 09:37:29.430485 master-0 kubenswrapper[15202]: I0319 09:37:29.430409 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 19 09:37:29.483832 master-0 kubenswrapper[15202]: I0319 09:37:29.483780 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 19 09:37:29.667870 master-0 kubenswrapper[15202]: I0319 09:37:29.667700 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:37:29.737648 master-0 kubenswrapper[15202]: I0319 09:37:29.737574 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 09:37:29.750710 master-0 kubenswrapper[15202]: I0319 09:37:29.750655 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:37:29.803421 master-0 kubenswrapper[15202]: I0319 09:37:29.803357 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 09:37:29.828883 master-0 kubenswrapper[15202]: I0319 09:37:29.828771 15202 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 09:37:29.829185 master-0 kubenswrapper[15202]: I0319 09:37:29.829135 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" containerID="cri-o://cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8" gracePeriod=5
Mar 19 09:37:29.850121 master-0 kubenswrapper[15202]: I0319 09:37:29.850052 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 09:37:29.901932 master-0 kubenswrapper[15202]: I0319 09:37:29.901856 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 19 09:37:29.945636 master-0 kubenswrapper[15202]: I0319 09:37:29.945577 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 09:37:30.108567 master-0 kubenswrapper[15202]: I0319 09:37:30.108499 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 09:37:30.180723 master-0 kubenswrapper[15202]: I0319 09:37:30.180634 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 09:37:30.292284 master-0 kubenswrapper[15202]: I0319 09:37:30.292132 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 09:37:30.296526 master-0 kubenswrapper[15202]: I0319 09:37:30.296494 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 09:37:30.383761 master-0 kubenswrapper[15202]: I0319 09:37:30.383683 15202 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 09:37:30.524054 master-0 kubenswrapper[15202]: I0319 09:37:30.523997 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:37:30.743354 master-0 kubenswrapper[15202]: I0319 09:37:30.743303 15202 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 09:37:30.934539 master-0 kubenswrapper[15202]: I0319 09:37:30.934483 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-dtscf"
Mar 19 09:37:30.943722 master-0 kubenswrapper[15202]: I0319 09:37:30.943679 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-jwt9n"
Mar 19 09:37:30.984550 master-0 kubenswrapper[15202]: I0319 09:37:30.984489 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 09:37:31.119017 master-0 kubenswrapper[15202]: I0319 09:37:31.118883 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 09:37:31.144272 master-0 kubenswrapper[15202]: I0319 09:37:31.144221 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 09:37:31.149412 master-0 kubenswrapper[15202]: I0319 09:37:31.149377 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 09:37:31.209196 master-0 kubenswrapper[15202]: I0319 09:37:31.209049 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 09:37:31.248663 master-0 kubenswrapper[15202]: I0319 09:37:31.248588 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 09:37:31.252202 master-0 kubenswrapper[15202]: I0319 09:37:31.252160 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:37:31.260653 master-0 kubenswrapper[15202]: I0319 09:37:31.260611 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 19 09:37:31.311633 master-0 kubenswrapper[15202]: I0319 09:37:31.311574 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 09:37:31.444100 master-0 kubenswrapper[15202]: I0319 09:37:31.444032 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 19 09:37:31.463999 master-0 kubenswrapper[15202]: I0319 09:37:31.463922 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jvr7z"
Mar 19 09:37:31.678749 master-0 kubenswrapper[15202]: I0319 09:37:31.678678 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 09:37:31.796024 master-0 kubenswrapper[15202]: I0319 09:37:31.795844 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 19 09:37:31.845551 master-0 kubenswrapper[15202]: I0319 09:37:31.845441 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-lvzxr"
Mar 19 09:37:31.899365 master-0 kubenswrapper[15202]: I0319 09:37:31.899295 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 09:37:31.921188 master-0 kubenswrapper[15202]: I0319 09:37:31.921095 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 09:37:31.930760 master-0 kubenswrapper[15202]: I0319 09:37:31.930686 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 09:37:32.075423 master-0 kubenswrapper[15202]: I0319 09:37:32.075284 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 09:37:32.204422 master-0 kubenswrapper[15202]: I0319 09:37:32.204350 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:37:32.243401 master-0 kubenswrapper[15202]: I0319 09:37:32.243320 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 09:37:32.278229 master-0 kubenswrapper[15202]: I0319 09:37:32.278178 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 09:37:32.418411 master-0 kubenswrapper[15202]: I0319 09:37:32.418247 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-zsf7l"
Mar 19 09:37:32.476917 master-0 kubenswrapper[15202]: I0319 09:37:32.476861 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 09:37:32.571223 master-0 kubenswrapper[15202]: I0319 09:37:32.571144 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 19 09:37:32.686720 master-0 kubenswrapper[15202]: I0319 09:37:32.686640 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:37:32.709754 master-0 kubenswrapper[15202]: I0319 09:37:32.709683 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 19 09:37:33.048869 master-0 kubenswrapper[15202]: I0319 09:37:33.048683 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 09:37:33.229651 master-0 kubenswrapper[15202]: I0319 09:37:33.229587 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vp2s5"
Mar 19 09:37:33.258110 master-0 kubenswrapper[15202]: I0319 09:37:33.258044 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 09:37:33.281407 master-0 kubenswrapper[15202]: I0319 09:37:33.281320 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 09:37:33.436710 master-0 kubenswrapper[15202]: I0319 09:37:33.436644 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 09:37:34.571353 master-0 kubenswrapper[15202]: I0319 09:37:34.571274 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 19 09:37:35.415262 master-0 kubenswrapper[15202]: I0319 09:37:35.415189 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_ebbfbf2b56df0323ba118d68bfdad8b9/startup-monitor/0.log"
Mar 19 09:37:35.415262 master-0 kubenswrapper[15202]: I0319 09:37:35.415269 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:37:35.428517 master-0 kubenswrapper[15202]: I0319 09:37:35.428371 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_ebbfbf2b56df0323ba118d68bfdad8b9/startup-monitor/0.log"
Mar 19 09:37:35.428985 master-0 kubenswrapper[15202]: I0319 09:37:35.428545 15202 generic.go:334] "Generic (PLEG): container finished" podID="ebbfbf2b56df0323ba118d68bfdad8b9" containerID="cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8" exitCode=137
Mar 19 09:37:35.428985 master-0 kubenswrapper[15202]: I0319 09:37:35.428634 15202 scope.go:117] "RemoveContainer" containerID="cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8"
Mar 19 09:37:35.428985 master-0 kubenswrapper[15202]: I0319 09:37:35.428682 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 09:37:35.449403 master-0 kubenswrapper[15202]: I0319 09:37:35.449352 15202 scope.go:117] "RemoveContainer" containerID="cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8"
Mar 19 09:37:35.450295 master-0 kubenswrapper[15202]: E0319 09:37:35.450199 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8\": container with ID starting with cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8 not found: ID does not exist" containerID="cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8"
Mar 19 09:37:35.450414 master-0 kubenswrapper[15202]: I0319 09:37:35.450302 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8"} err="failed to get container status \"cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8\": rpc error: code = NotFound desc = could not find container \"cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8\": container with ID starting with cbc452a22c564a3b1d5bba68d7eaab6dc49d3d6d6bbdb8b2282cd572d3ecdeb8 not found: ID does not exist"
Mar 19 09:37:35.484982 master-0 kubenswrapper[15202]: I0319 09:37:35.484837 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:37:35.485440 master-0 kubenswrapper[15202]: I0319 09:37:35.485050 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:37:35.485440 master-0 kubenswrapper[15202]: I0319 09:37:35.485244 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:37:35.485440 master-0 kubenswrapper[15202]: I0319 09:37:35.485286 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:37:35.485440 master-0 kubenswrapper[15202]: I0319 09:37:35.485361 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:37:35.485908 master-0 kubenswrapper[15202]: I0319 09:37:35.485400 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests" (OuterVolumeSpecName: "manifests") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:37:35.485908 master-0 kubenswrapper[15202]: I0319 09:37:35.485452 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 09:37:35.485908 master-0 kubenswrapper[15202]: I0319 09:37:35.485627 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log" (OuterVolumeSpecName: "var-log") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:37:35.486601 master-0 kubenswrapper[15202]: I0319 09:37:35.485265 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock" (OuterVolumeSpecName: "var-lock") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:37:35.486601 master-0 kubenswrapper[15202]: I0319 09:37:35.486571 15202 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:35.486812 master-0 kubenswrapper[15202]: I0319 09:37:35.486603 15202 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:35.486812 master-0 kubenswrapper[15202]: I0319 09:37:35.486626 15202 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:35.491390 master-0 kubenswrapper[15202]: I0319 09:37:35.491281 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:37:35.588164 master-0 kubenswrapper[15202]: I0319 09:37:35.588065 15202 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:35.588164 master-0 kubenswrapper[15202]: I0319 09:37:35.588123 15202 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 09:37:36.827983 master-0 kubenswrapper[15202]: I0319 09:37:36.827917 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" path="/var/lib/kubelet/pods/ebbfbf2b56df0323ba118d68bfdad8b9/volumes"
Mar 19 09:37:49.158453 master-0 kubenswrapper[15202]: I0319 09:37:49.158361 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 19 09:37:53.096150 master-0 kubenswrapper[15202]: I0319 09:37:53.096074 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 09:37:54.251444 master-0 kubenswrapper[15202]: I0319 09:37:54.251378 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-pr68p"
Mar 19 09:37:57.616930 master-0 kubenswrapper[15202]: I0319 09:37:57.616873 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_beb38ec27e482ba63d3c0762a843676a/kube-controller-manager/1.log"
Mar 19 09:37:57.618501 master-0 kubenswrapper[15202]: I0319 09:37:57.618443 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_beb38ec27e482ba63d3c0762a843676a/kube-controller-manager/0.log"
Mar 19 09:37:57.618630 master-0 kubenswrapper[15202]: I0319 09:37:57.618542 15202 generic.go:334] "Generic (PLEG): container finished" podID="beb38ec27e482ba63d3c0762a843676a" containerID="183fcd42c45db1c3563f11bb2c7bfb861eb30e9db72cbc705f94299e820d7898" exitCode=137
Mar 19 09:37:57.618630 master-0 kubenswrapper[15202]: I0319 09:37:57.618587 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerDied","Data":"183fcd42c45db1c3563f11bb2c7bfb861eb30e9db72cbc705f94299e820d7898"}
Mar 19 09:37:57.618630 master-0 kubenswrapper[15202]: I0319 09:37:57.618620 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"beb38ec27e482ba63d3c0762a843676a","Type":"ContainerStarted","Data":"35e5040d8963c00aea5c5abdfd902dc4754403ae77cbabf5972d0930bc941628"}
Mar 19 09:37:57.618945 master-0 kubenswrapper[15202]: I0319 09:37:57.618647 15202 scope.go:117] "RemoveContainer" containerID="3eb2ddaa705b087b996222e4c44f06db471868ee4276507978e4fec18e59bf35"
Mar 19 09:37:58.628766 master-0 kubenswrapper[15202]: I0319 09:37:58.628716 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_beb38ec27e482ba63d3c0762a843676a/kube-controller-manager/1.log"
Mar 19 09:38:06.871987 master-0 kubenswrapper[15202]: I0319 09:38:06.871916 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:06.872971 master-0 kubenswrapper[15202]: I0319 09:38:06.872927 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:06.877781 master-0 kubenswrapper[15202]: I0319 09:38:06.877736 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:07.695130 master-0 kubenswrapper[15202]: I0319 09:38:07.695058 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 09:38:08.010899 master-0 kubenswrapper[15202]: I0319 09:38:08.010741 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 19 09:39:11.218072 master-0 kubenswrapper[15202]: I0319 09:39:11.217950 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57bfdb854-c5vtx"]
Mar 19 09:39:11.219578 master-0 kubenswrapper[15202]: E0319 09:39:11.218820 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" containerName="installer"
Mar 19 09:39:11.219578 master-0 kubenswrapper[15202]: I0319 09:39:11.218859 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" containerName="installer"
Mar 19 09:39:11.219578 master-0 kubenswrapper[15202]: E0319 09:39:11.218902 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor"
Mar 19 09:39:11.219578 master-0 kubenswrapper[15202]: I0319 09:39:11.218922 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor"
Mar 19 09:39:11.219578 master-0 kubenswrapper[15202]: I0319 09:39:11.219205 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f348ddf-ee67-4f81-9c16-b35d1a918669" containerName="installer"
Mar 19 09:39:11.219578 master-0 kubenswrapper[15202]: I0319 09:39:11.219274 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor"
Mar 19 09:39:11.223523 master-0 kubenswrapper[15202]: I0319 09:39:11.220291 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.231700 master-0 kubenswrapper[15202]: I0319 09:39:11.231613 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:39:11.232028 master-0 kubenswrapper[15202]: I0319 09:39:11.231767 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-57xnh"
Mar 19 09:39:11.232028 master-0 kubenswrapper[15202]: I0319 09:39:11.231806 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:39:11.232028 master-0 kubenswrapper[15202]: I0319 09:39:11.231967 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:39:11.232360 master-0 kubenswrapper[15202]: I0319 09:39:11.232144 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:39:11.232360 master-0 kubenswrapper[15202]: I0319 09:39:11.232179 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:39:11.239252 master-0 kubenswrapper[15202]: I0319 09:39:11.239177 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qd25m"]
Mar 19 09:39:11.240962 master-0 kubenswrapper[15202]: I0319 09:39:11.240909 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qd25m"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.250843 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqflr\" (UniqueName: \"kubernetes.io/projected/2c72041e-60f6-43b1-b435-16874b591bd4-kube-api-access-hqflr\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251014 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c72041e-60f6-43b1-b435-16874b591bd4-serviceca\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251110 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c72041e-60f6-43b1-b435-16874b591bd4-host\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251248 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-proxy-ca-bundles\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251284 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d22e99-64ff-456e-878d-d68fef32b117-serving-cert\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251330 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdfr\" (UniqueName: \"kubernetes.io/projected/99d22e99-64ff-456e-878d-d68fef32b117-kube-api-access-5bdfr\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251370 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-client-ca\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.251513 master-0 kubenswrapper[15202]: I0319 09:39:11.251401 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-config\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.259495 master-0 kubenswrapper[15202]: I0319 09:39:11.256554 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"]
Mar 19 09:39:11.304489 master-0 kubenswrapper[15202]: I0319 09:39:11.303714 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.308483 master-0 kubenswrapper[15202]: I0319 09:39:11.305099 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-bk5qk"
Mar 19 09:39:11.314482 master-0 kubenswrapper[15202]: I0319 09:39:11.312185 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 09:39:11.314482 master-0 kubenswrapper[15202]: I0319 09:39:11.312260 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 09:39:11.314482 master-0 kubenswrapper[15202]: I0319 09:39:11.312530 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 09:39:11.314482 master-0 kubenswrapper[15202]: I0319 09:39:11.312645 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 09:39:11.314482 master-0 kubenswrapper[15202]: I0319 09:39:11.312702 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 09:39:11.314482 master-0 kubenswrapper[15202]: I0319 09:39:11.313346 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 09:39:11.325512 master-0 kubenswrapper[15202]: I0319 09:39:11.324357 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 19 09:39:11.338493 master-0 kubenswrapper[15202]: I0319 09:39:11.336275 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 09:39:11.338493 master-0 kubenswrapper[15202]: I0319 09:39:11.336776 15202 reflector.go:368] Caches populated
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:39:11.338493 master-0 kubenswrapper[15202]: I0319 09:39:11.336995 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 09:39:11.338493 master-0 kubenswrapper[15202]: I0319 09:39:11.337215 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 09:39:11.338493 master-0 kubenswrapper[15202]: I0319 09:39:11.337816 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 09:39:11.338493 master-0 kubenswrapper[15202]: I0319 09:39:11.337968 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-2qc5w" Mar 19 09:39:11.344492 master-0 kubenswrapper[15202]: I0319 09:39:11.342639 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352421 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c72041e-60f6-43b1-b435-16874b591bd4-host\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352512 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-audit-policies\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: 
I0319 09:39:11.352547 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-proxy-ca-bundles\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352548 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352643 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c72041e-60f6-43b1-b435-16874b591bd4-host\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352574 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352895 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 
09:39:11.352932 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d22e99-64ff-456e-878d-d68fef32b117-serving-cert\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.352998 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353037 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-audit-dir\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353076 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6x8x\" (UniqueName: \"kubernetes.io/projected/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-kube-api-access-r6x8x\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353160 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdfr\" (UniqueName: 
\"kubernetes.io/projected/99d22e99-64ff-456e-878d-d68fef32b117-kube-api-access-5bdfr\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353253 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-error\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353303 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-client-ca\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353345 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-config\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353451 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " 
pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353498 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-session\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.353580 master-0 kubenswrapper[15202]: I0319 09:39:11.353548 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-login\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.354376 master-0 kubenswrapper[15202]: I0319 09:39:11.353645 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqflr\" (UniqueName: \"kubernetes.io/projected/2c72041e-60f6-43b1-b435-16874b591bd4-kube-api-access-hqflr\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m" Mar 19 09:39:11.354376 master-0 kubenswrapper[15202]: I0319 09:39:11.353690 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.354376 master-0 kubenswrapper[15202]: I0319 09:39:11.353723 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.354376 master-0 kubenswrapper[15202]: I0319 09:39:11.353750 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.354376 master-0 kubenswrapper[15202]: I0319 09:39:11.353798 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c72041e-60f6-43b1-b435-16874b591bd4-serviceca\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m" Mar 19 09:39:11.354376 master-0 kubenswrapper[15202]: I0319 09:39:11.353813 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-proxy-ca-bundles\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.358487 master-0 kubenswrapper[15202]: I0319 09:39:11.354625 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-client-ca\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: 
\"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.358487 master-0 kubenswrapper[15202]: I0319 09:39:11.355158 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-config\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.358487 master-0 kubenswrapper[15202]: I0319 09:39:11.355591 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c72041e-60f6-43b1-b435-16874b591bd4-serviceca\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m" Mar 19 09:39:11.365492 master-0 kubenswrapper[15202]: I0319 09:39:11.361431 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"] Mar 19 09:39:11.377094 master-0 kubenswrapper[15202]: I0319 09:39:11.377030 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d22e99-64ff-456e-878d-d68fef32b117-serving-cert\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.379771 master-0 kubenswrapper[15202]: I0319 09:39:11.379729 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 09:39:11.381534 master-0 kubenswrapper[15202]: I0319 09:39:11.381487 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" Mar 19 09:39:11.383114 master-0 kubenswrapper[15202]: I0319 09:39:11.382737 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqflr\" (UniqueName: \"kubernetes.io/projected/2c72041e-60f6-43b1-b435-16874b591bd4-kube-api-access-hqflr\") pod \"node-ca-qd25m\" (UID: \"2c72041e-60f6-43b1-b435-16874b591bd4\") " pod="openshift-image-registry/node-ca-qd25m" Mar 19 09:39:11.383860 master-0 kubenswrapper[15202]: I0319 09:39:11.383820 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdfr\" (UniqueName: \"kubernetes.io/projected/99d22e99-64ff-456e-878d-d68fef32b117-kube-api-access-5bdfr\") pod \"controller-manager-57bfdb854-c5vtx\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") " pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:11.385521 master-0 kubenswrapper[15202]: I0319 09:39:11.385035 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:39:11.385521 master-0 kubenswrapper[15202]: I0319 09:39:11.385271 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:39:11.385521 master-0 kubenswrapper[15202]: I0319 09:39:11.385296 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:39:11.385521 master-0 kubenswrapper[15202]: I0319 09:39:11.385312 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-tgqwm" Mar 19 09:39:11.386789 master-0 kubenswrapper[15202]: I0319 09:39:11.386728 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"] Mar 19 09:39:11.389937 master-0 
kubenswrapper[15202]: I0319 09:39:11.389900 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:39:11.390146 master-0 kubenswrapper[15202]: I0319 09:39:11.390118 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:39:11.392892 master-0 kubenswrapper[15202]: I0319 09:39:11.392853 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"] Mar 19 09:39:11.398022 master-0 kubenswrapper[15202]: I0319 09:39:11.397985 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bfdb854-c5vtx"] Mar 19 09:39:11.455934 master-0 kubenswrapper[15202]: I0319 09:39:11.455851 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-session\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.455934 master-0 kubenswrapper[15202]: I0319 09:39:11.455940 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxl28\" (UniqueName: \"kubernetes.io/projected/4eca1dbd-d183-407d-99df-eea5aee3474d-kube-api-access-xxl28\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" Mar 19 09:39:11.456554 master-0 kubenswrapper[15202]: I0319 09:39:11.455986 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-login\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.456554 master-0 kubenswrapper[15202]: I0319 09:39:11.456162 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.456554 master-0 kubenswrapper[15202]: I0319 09:39:11.456455 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.456554 master-0 kubenswrapper[15202]: I0319 09:39:11.456545 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.456970 master-0 kubenswrapper[15202]: I0319 09:39:11.456645 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-audit-policies\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " 
pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.456970 master-0 kubenswrapper[15202]: I0319 09:39:11.456681 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.456970 master-0 kubenswrapper[15202]: I0319 09:39:11.456730 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.457214 master-0 kubenswrapper[15202]: I0319 09:39:11.457041 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-client-ca\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" Mar 19 09:39:11.457214 master-0 kubenswrapper[15202]: I0319 09:39:11.457083 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.457214 master-0 kubenswrapper[15202]: I0319 
09:39:11.457107 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-audit-dir\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.457214 master-0 kubenswrapper[15202]: I0319 09:39:11.457147 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-audit-dir\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.457385 master-0 kubenswrapper[15202]: I0319 09:39:11.457204 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6x8x\" (UniqueName: \"kubernetes.io/projected/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-kube-api-access-r6x8x\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.457385 master-0 kubenswrapper[15202]: I0319 09:39:11.457316 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eca1dbd-d183-407d-99df-eea5aee3474d-serving-cert\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" Mar 19 09:39:11.457385 master-0 kubenswrapper[15202]: I0319 09:39:11.457356 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-error\") pod 
\"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.457385 master-0 kubenswrapper[15202]: I0319 09:39:11.457376 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-config\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" Mar 19 09:39:11.457631 master-0 kubenswrapper[15202]: I0319 09:39:11.457510 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.458262 master-0 kubenswrapper[15202]: I0319 09:39:11.458112 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-audit-policies\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.458555 master-0 kubenswrapper[15202]: I0319 09:39:11.458503 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" Mar 19 09:39:11.458727 master-0 kubenswrapper[15202]: I0319 09:39:11.458687 
15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.459171 master-0 kubenswrapper[15202]: I0319 09:39:11.459104 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.459658 master-0 kubenswrapper[15202]: I0319 09:39:11.459622 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-login\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.460522 master-0 kubenswrapper[15202]: I0319 09:39:11.460453 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.461005 master-0 kubenswrapper[15202]: I0319 09:39:11.460965 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-error\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.463025 master-0 kubenswrapper[15202]: I0319 09:39:11.461992 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-session\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.463025 master-0 kubenswrapper[15202]: I0319 09:39:11.462519 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.463025 master-0 kubenswrapper[15202]: I0319 09:39:11.462928 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.463442 master-0 kubenswrapper[15202]: I0319 09:39:11.463399 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.476161 master-0 kubenswrapper[15202]: I0319 09:39:11.476072 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6x8x\" (UniqueName: \"kubernetes.io/projected/a00456f4-7f6b-4c56-bcd6-72e0f04b84d6-kube-api-access-r6x8x\") pod \"oauth-openshift-69bfd98cf-4dhhm\" (UID: \"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6\") " pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.545578 master-0 kubenswrapper[15202]: I0319 09:39:11.544870 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:11.558283 master-0 kubenswrapper[15202]: I0319 09:39:11.558158 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-client-ca\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.558283 master-0 kubenswrapper[15202]: I0319 09:39:11.558292 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eca1dbd-d183-407d-99df-eea5aee3474d-serving-cert\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.558714 master-0 kubenswrapper[15202]: I0319 09:39:11.558328 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-config\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.558714 master-0 kubenswrapper[15202]: I0319 09:39:11.558387 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxl28\" (UniqueName: \"kubernetes.io/projected/4eca1dbd-d183-407d-99df-eea5aee3474d-kube-api-access-xxl28\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.559588 master-0 kubenswrapper[15202]: I0319 09:39:11.559542 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-client-ca\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.560251 master-0 kubenswrapper[15202]: I0319 09:39:11.560160 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-config\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.562682 master-0 kubenswrapper[15202]: I0319 09:39:11.562616 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eca1dbd-d183-407d-99df-eea5aee3474d-serving-cert\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.584589 master-0 kubenswrapper[15202]: I0319 09:39:11.584537 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxl28\" (UniqueName: \"kubernetes.io/projected/4eca1dbd-d183-407d-99df-eea5aee3474d-kube-api-access-xxl28\") pod \"route-controller-manager-55f5cd545d-pkh9v\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") " pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.615764 master-0 kubenswrapper[15202]: I0319 09:39:11.615669 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qd25m"
Mar 19 09:39:11.648539 master-0 kubenswrapper[15202]: W0319 09:39:11.648447 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c72041e_60f6_43b1_b435_16874b591bd4.slice/crio-cd972eb8b696e68424a89cee8b1de022f1e86ff7c63c80c2a7a469032c853ad8 WatchSource:0}: Error finding container cd972eb8b696e68424a89cee8b1de022f1e86ff7c63c80c2a7a469032c853ad8: Status 404 returned error can't find the container with id cd972eb8b696e68424a89cee8b1de022f1e86ff7c63c80c2a7a469032c853ad8
Mar 19 09:39:11.651236 master-0 kubenswrapper[15202]: I0319 09:39:11.651203 15202 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:39:11.727825 master-0 kubenswrapper[15202]: I0319 09:39:11.727581 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:11.743975 master-0 kubenswrapper[15202]: I0319 09:39:11.743901 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:11.983312 master-0 kubenswrapper[15202]: I0319 09:39:11.983127 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bfdb854-c5vtx"]
Mar 19 09:39:11.989724 master-0 kubenswrapper[15202]: W0319 09:39:11.989662 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d22e99_64ff_456e_878d_d68fef32b117.slice/crio-fe19c5b27dac1656474706a53fa961b391af75de2b39ee0544a86bb35fa8f67c WatchSource:0}: Error finding container fe19c5b27dac1656474706a53fa961b391af75de2b39ee0544a86bb35fa8f67c: Status 404 returned error can't find the container with id fe19c5b27dac1656474706a53fa961b391af75de2b39ee0544a86bb35fa8f67c
Mar 19 09:39:12.163752 master-0 kubenswrapper[15202]: I0319 09:39:12.163691 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"]
Mar 19 09:39:12.166348 master-0 kubenswrapper[15202]: W0319 09:39:12.166262 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00456f4_7f6b_4c56_bcd6_72e0f04b84d6.slice/crio-8d1e718cb28cb211c303589e96766613f2c6272b63dd047ea3c79882f6d65a8e WatchSource:0}: Error finding container 8d1e718cb28cb211c303589e96766613f2c6272b63dd047ea3c79882f6d65a8e: Status 404 returned error can't find the container with id 8d1e718cb28cb211c303589e96766613f2c6272b63dd047ea3c79882f6d65a8e
Mar 19 09:39:12.251998 master-0 kubenswrapper[15202]: I0319 09:39:12.251935 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"]
Mar 19 09:39:12.252576 master-0 kubenswrapper[15202]: W0319 09:39:12.252374 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eca1dbd_d183_407d_99df_eea5aee3474d.slice/crio-b24304450dfe3386c2fd82981b6423addab4f051b2c62d16f4577bc4f72f6246 WatchSource:0}: Error finding container b24304450dfe3386c2fd82981b6423addab4f051b2c62d16f4577bc4f72f6246: Status 404 returned error can't find the container with id b24304450dfe3386c2fd82981b6423addab4f051b2c62d16f4577bc4f72f6246
Mar 19 09:39:12.374795 master-0 kubenswrapper[15202]: I0319 09:39:12.373728 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" event={"ID":"99d22e99-64ff-456e-878d-d68fef32b117","Type":"ContainerStarted","Data":"12a1c38306f21000a7ffcb590c62cefad13f5e6406002c495d46830e8bb5389a"}
Mar 19 09:39:12.374795 master-0 kubenswrapper[15202]: I0319 09:39:12.373782 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" event={"ID":"99d22e99-64ff-456e-878d-d68fef32b117","Type":"ContainerStarted","Data":"fe19c5b27dac1656474706a53fa961b391af75de2b39ee0544a86bb35fa8f67c"}
Mar 19 09:39:12.379487 master-0 kubenswrapper[15202]: I0319 09:39:12.377316 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:12.379487 master-0 kubenswrapper[15202]: I0319 09:39:12.378344 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" event={"ID":"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6","Type":"ContainerStarted","Data":"8d1e718cb28cb211c303589e96766613f2c6272b63dd047ea3c79882f6d65a8e"}
Mar 19 09:39:12.384915 master-0 kubenswrapper[15202]: I0319 09:39:12.384850 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" event={"ID":"4eca1dbd-d183-407d-99df-eea5aee3474d","Type":"ContainerStarted","Data":"b24304450dfe3386c2fd82981b6423addab4f051b2c62d16f4577bc4f72f6246"}
Mar 19 09:39:12.388075 master-0 kubenswrapper[15202]: I0319 09:39:12.388030 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:12.389030 master-0 kubenswrapper[15202]: I0319 09:39:12.388984 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qd25m" event={"ID":"2c72041e-60f6-43b1-b435-16874b591bd4","Type":"ContainerStarted","Data":"cd972eb8b696e68424a89cee8b1de022f1e86ff7c63c80c2a7a469032c853ad8"}
Mar 19 09:39:12.408037 master-0 kubenswrapper[15202]: I0319 09:39:12.407967 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" podStartSLOduration=216.407949856 podStartE2EDuration="3m36.407949856s" podCreationTimestamp="2026-03-19 09:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:39:12.399754729 +0000 UTC m=+869.785169545" watchObservedRunningTime="2026-03-19 09:39:12.407949856 +0000 UTC m=+869.793364672"
Mar 19 09:39:13.401335 master-0 kubenswrapper[15202]: I0319 09:39:13.400437 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" event={"ID":"a00456f4-7f6b-4c56-bcd6-72e0f04b84d6","Type":"ContainerStarted","Data":"4a7725875c549db66198d7e892a5f918cd43fd7a04f81e34ab0c54cbdb028ba7"}
Mar 19 09:39:13.401335 master-0 kubenswrapper[15202]: I0319 09:39:13.400572 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:13.408394 master-0 kubenswrapper[15202]: I0319 09:39:13.408326 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" event={"ID":"4eca1dbd-d183-407d-99df-eea5aee3474d","Type":"ContainerStarted","Data":"2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760"}
Mar 19 09:39:13.408505 master-0 kubenswrapper[15202]: I0319 09:39:13.408433 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm"
Mar 19 09:39:13.408838 master-0 kubenswrapper[15202]: I0319 09:39:13.408779 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:13.414155 master-0 kubenswrapper[15202]: I0319 09:39:13.414066 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:13.468676 master-0 kubenswrapper[15202]: I0319 09:39:13.468595 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-69bfd98cf-4dhhm" podStartSLOduration=217.46856487 podStartE2EDuration="3m37.46856487s" podCreationTimestamp="2026-03-19 09:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:39:13.467959775 +0000 UTC m=+870.853374601" watchObservedRunningTime="2026-03-19 09:39:13.46856487 +0000 UTC m=+870.853979696"
Mar 19 09:39:13.543323 master-0 kubenswrapper[15202]: I0319 09:39:13.543203 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" podStartSLOduration=217.543169614 podStartE2EDuration="3m37.543169614s" podCreationTimestamp="2026-03-19 09:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:39:13.539113327 +0000 UTC m=+870.924528143" watchObservedRunningTime="2026-03-19 09:39:13.543169614 +0000 UTC m=+870.928584450"
Mar 19 09:39:15.425828 master-0 kubenswrapper[15202]: I0319 09:39:15.424333 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qd25m" event={"ID":"2c72041e-60f6-43b1-b435-16874b591bd4","Type":"ContainerStarted","Data":"7cde06e865231ad84aa9df0054365e6467417aae4b031dfddf0a4a85e7b1f892"}
Mar 19 09:39:15.450828 master-0 kubenswrapper[15202]: I0319 09:39:15.450719 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qd25m" podStartSLOduration=57.64268171 podStartE2EDuration="1m0.450695533s" podCreationTimestamp="2026-03-19 09:38:15 +0000 UTC" firstStartedPulling="2026-03-19 09:39:11.651082246 +0000 UTC m=+869.036497062" lastFinishedPulling="2026-03-19 09:39:14.459096069 +0000 UTC m=+871.844510885" observedRunningTime="2026-03-19 09:39:15.443981701 +0000 UTC m=+872.829396517" watchObservedRunningTime="2026-03-19 09:39:15.450695533 +0000 UTC m=+872.836110369"
Mar 19 09:39:26.653500 master-0 kubenswrapper[15202]: I0319 09:39:26.653358 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57bfdb854-c5vtx"]
Mar 19 09:39:26.654371 master-0 kubenswrapper[15202]: I0319 09:39:26.654205 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" podUID="99d22e99-64ff-456e-878d-d68fef32b117" containerName="controller-manager" containerID="cri-o://12a1c38306f21000a7ffcb590c62cefad13f5e6406002c495d46830e8bb5389a" gracePeriod=30
Mar 19 09:39:26.703162 master-0 kubenswrapper[15202]: I0319 09:39:26.703069 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"]
Mar 19 09:39:26.704188 master-0 kubenswrapper[15202]: I0319 09:39:26.704102 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" podUID="4eca1dbd-d183-407d-99df-eea5aee3474d" containerName="route-controller-manager" containerID="cri-o://2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760" gracePeriod=30
Mar 19 09:39:27.432365 master-0 kubenswrapper[15202]: I0319 09:39:27.432293 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:27.532098 master-0 kubenswrapper[15202]: I0319 09:39:27.532015 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-config\") pod \"4eca1dbd-d183-407d-99df-eea5aee3474d\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") "
Mar 19 09:39:27.532444 master-0 kubenswrapper[15202]: I0319 09:39:27.532171 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eca1dbd-d183-407d-99df-eea5aee3474d-serving-cert\") pod \"4eca1dbd-d183-407d-99df-eea5aee3474d\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") "
Mar 19 09:39:27.532444 master-0 kubenswrapper[15202]: I0319 09:39:27.532323 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxl28\" (UniqueName: \"kubernetes.io/projected/4eca1dbd-d183-407d-99df-eea5aee3474d-kube-api-access-xxl28\") pod \"4eca1dbd-d183-407d-99df-eea5aee3474d\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") "
Mar 19 09:39:27.532564 master-0 kubenswrapper[15202]: I0319 09:39:27.532492 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-client-ca\") pod \"4eca1dbd-d183-407d-99df-eea5aee3474d\" (UID: \"4eca1dbd-d183-407d-99df-eea5aee3474d\") "
Mar 19 09:39:27.534153 master-0 kubenswrapper[15202]: I0319 09:39:27.534086 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-config" (OuterVolumeSpecName: "config") pod "4eca1dbd-d183-407d-99df-eea5aee3474d" (UID: "4eca1dbd-d183-407d-99df-eea5aee3474d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:39:27.534153 master-0 kubenswrapper[15202]: I0319 09:39:27.534121 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4eca1dbd-d183-407d-99df-eea5aee3474d" (UID: "4eca1dbd-d183-407d-99df-eea5aee3474d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:39:27.537048 master-0 kubenswrapper[15202]: I0319 09:39:27.536984 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eca1dbd-d183-407d-99df-eea5aee3474d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4eca1dbd-d183-407d-99df-eea5aee3474d" (UID: "4eca1dbd-d183-407d-99df-eea5aee3474d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:39:27.539879 master-0 kubenswrapper[15202]: I0319 09:39:27.538965 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eca1dbd-d183-407d-99df-eea5aee3474d-kube-api-access-xxl28" (OuterVolumeSpecName: "kube-api-access-xxl28") pod "4eca1dbd-d183-407d-99df-eea5aee3474d" (UID: "4eca1dbd-d183-407d-99df-eea5aee3474d"). InnerVolumeSpecName "kube-api-access-xxl28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:39:27.579053 master-0 kubenswrapper[15202]: I0319 09:39:27.578991 15202 generic.go:334] "Generic (PLEG): container finished" podID="99d22e99-64ff-456e-878d-d68fef32b117" containerID="12a1c38306f21000a7ffcb590c62cefad13f5e6406002c495d46830e8bb5389a" exitCode=0
Mar 19 09:39:27.579202 master-0 kubenswrapper[15202]: I0319 09:39:27.579085 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" event={"ID":"99d22e99-64ff-456e-878d-d68fef32b117","Type":"ContainerDied","Data":"12a1c38306f21000a7ffcb590c62cefad13f5e6406002c495d46830e8bb5389a"}
Mar 19 09:39:27.581941 master-0 kubenswrapper[15202]: I0319 09:39:27.581898 15202 generic.go:334] "Generic (PLEG): container finished" podID="4eca1dbd-d183-407d-99df-eea5aee3474d" containerID="2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760" exitCode=0
Mar 19 09:39:27.582024 master-0 kubenswrapper[15202]: I0319 09:39:27.581973 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"
Mar 19 09:39:27.582024 master-0 kubenswrapper[15202]: I0319 09:39:27.581967 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" event={"ID":"4eca1dbd-d183-407d-99df-eea5aee3474d","Type":"ContainerDied","Data":"2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760"}
Mar 19 09:39:27.582092 master-0 kubenswrapper[15202]: I0319 09:39:27.582064 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v" event={"ID":"4eca1dbd-d183-407d-99df-eea5aee3474d","Type":"ContainerDied","Data":"b24304450dfe3386c2fd82981b6423addab4f051b2c62d16f4577bc4f72f6246"}
Mar 19 09:39:27.582126 master-0 kubenswrapper[15202]: I0319 09:39:27.582093 15202 scope.go:117] "RemoveContainer" containerID="2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760"
Mar 19 09:39:27.617245 master-0 kubenswrapper[15202]: I0319 09:39:27.617127 15202 scope.go:117] "RemoveContainer" containerID="2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760"
Mar 19 09:39:27.617867 master-0 kubenswrapper[15202]: E0319 09:39:27.617819 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760\": container with ID starting with 2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760 not found: ID does not exist" containerID="2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760"
Mar 19 09:39:27.617937 master-0 kubenswrapper[15202]: I0319 09:39:27.617873 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760"} err="failed to get container status \"2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760\": rpc error: code = NotFound desc = could not find container \"2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760\": container with ID starting with 2e12e5407dafd3cdfa6902799d170f257ecc970c1566c709603b0c706d383760 not found: ID does not exist"
Mar 19 09:39:27.639575 master-0 kubenswrapper[15202]: I0319 09:39:27.637587 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.639575 master-0 kubenswrapper[15202]: I0319 09:39:27.637644 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eca1dbd-d183-407d-99df-eea5aee3474d-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.639575 master-0 kubenswrapper[15202]: I0319 09:39:27.637659 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxl28\" (UniqueName: \"kubernetes.io/projected/4eca1dbd-d183-407d-99df-eea5aee3474d-kube-api-access-xxl28\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.639575 master-0 kubenswrapper[15202]: I0319 09:39:27.637670 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4eca1dbd-d183-407d-99df-eea5aee3474d-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.643521 master-0 kubenswrapper[15202]: I0319 09:39:27.642273 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"]
Mar 19 09:39:27.659543 master-0 kubenswrapper[15202]: I0319 09:39:27.659444 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f5cd545d-pkh9v"]
Mar 19 09:39:27.705971 master-0 kubenswrapper[15202]: I0319 09:39:27.705881 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx"
Mar 19 09:39:27.843416 master-0 kubenswrapper[15202]: I0319 09:39:27.843325 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d22e99-64ff-456e-878d-d68fef32b117-serving-cert\") pod \"99d22e99-64ff-456e-878d-d68fef32b117\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") "
Mar 19 09:39:27.843828 master-0 kubenswrapper[15202]: I0319 09:39:27.843455 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bdfr\" (UniqueName: \"kubernetes.io/projected/99d22e99-64ff-456e-878d-d68fef32b117-kube-api-access-5bdfr\") pod \"99d22e99-64ff-456e-878d-d68fef32b117\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") "
Mar 19 09:39:27.843828 master-0 kubenswrapper[15202]: I0319 09:39:27.843560 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-client-ca\") pod \"99d22e99-64ff-456e-878d-d68fef32b117\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") "
Mar 19 09:39:27.843828 master-0 kubenswrapper[15202]: I0319 09:39:27.843742 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-config\") pod \"99d22e99-64ff-456e-878d-d68fef32b117\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") "
Mar 19 09:39:27.843977 master-0 kubenswrapper[15202]: I0319 09:39:27.843849 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-proxy-ca-bundles\") pod \"99d22e99-64ff-456e-878d-d68fef32b117\" (UID: \"99d22e99-64ff-456e-878d-d68fef32b117\") "
Mar 19 09:39:27.844171 master-0 kubenswrapper[15202]: I0319 09:39:27.844093 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-client-ca" (OuterVolumeSpecName: "client-ca") pod "99d22e99-64ff-456e-878d-d68fef32b117" (UID: "99d22e99-64ff-456e-878d-d68fef32b117"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:39:27.844516 master-0 kubenswrapper[15202]: I0319 09:39:27.844439 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-config" (OuterVolumeSpecName: "config") pod "99d22e99-64ff-456e-878d-d68fef32b117" (UID: "99d22e99-64ff-456e-878d-d68fef32b117"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:39:27.845108 master-0 kubenswrapper[15202]: I0319 09:39:27.845062 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99d22e99-64ff-456e-878d-d68fef32b117" (UID: "99d22e99-64ff-456e-878d-d68fef32b117"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:39:27.845230 master-0 kubenswrapper[15202]: I0319 09:39:27.845186 15202 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.845284 master-0 kubenswrapper[15202]: I0319 09:39:27.845234 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.848359 master-0 kubenswrapper[15202]: I0319 09:39:27.848299 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d22e99-64ff-456e-878d-d68fef32b117-kube-api-access-5bdfr" (OuterVolumeSpecName: "kube-api-access-5bdfr") pod "99d22e99-64ff-456e-878d-d68fef32b117" (UID: "99d22e99-64ff-456e-878d-d68fef32b117"). InnerVolumeSpecName "kube-api-access-5bdfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:39:27.848359 master-0 kubenswrapper[15202]: I0319 09:39:27.848322 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d22e99-64ff-456e-878d-d68fef32b117-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99d22e99-64ff-456e-878d-d68fef32b117" (UID: "99d22e99-64ff-456e-878d-d68fef32b117"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:39:27.947397 master-0 kubenswrapper[15202]: I0319 09:39:27.947309 15202 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99d22e99-64ff-456e-878d-d68fef32b117-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.947397 master-0 kubenswrapper[15202]: I0319 09:39:27.947374 15202 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99d22e99-64ff-456e-878d-d68fef32b117-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:27.947397 master-0 kubenswrapper[15202]: I0319 09:39:27.947396 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bdfr\" (UniqueName: \"kubernetes.io/projected/99d22e99-64ff-456e-878d-d68fef32b117-kube-api-access-5bdfr\") on node \"master-0\" DevicePath \"\""
Mar 19 09:39:28.225881 master-0 kubenswrapper[15202]: I0319 09:39:28.225763 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj"]
Mar 19 09:39:28.226508 master-0 kubenswrapper[15202]: E0319 09:39:28.226406 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d22e99-64ff-456e-878d-d68fef32b117" containerName="controller-manager"
Mar 19 09:39:28.226508 master-0 kubenswrapper[15202]: I0319 09:39:28.226444 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d22e99-64ff-456e-878d-d68fef32b117" containerName="controller-manager"
Mar 19 09:39:28.226793 master-0 kubenswrapper[15202]: E0319 09:39:28.226520 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eca1dbd-d183-407d-99df-eea5aee3474d" containerName="route-controller-manager"
Mar 19 09:39:28.226793 master-0 kubenswrapper[15202]: I0319 09:39:28.226544 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eca1dbd-d183-407d-99df-eea5aee3474d" containerName="route-controller-manager"
Mar 19 09:39:28.226938 master-0 kubenswrapper[15202]: I0319 09:39:28.226811 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d22e99-64ff-456e-878d-d68fef32b117" containerName="controller-manager"
Mar 19 09:39:28.226938 master-0 kubenswrapper[15202]: I0319 09:39:28.226862 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eca1dbd-d183-407d-99df-eea5aee3474d" containerName="route-controller-manager"
Mar 19 09:39:28.227820 master-0 kubenswrapper[15202]: I0319 09:39:28.227780 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj"
Mar 19 09:39:28.238111 master-0 kubenswrapper[15202]: I0319 09:39:28.238009 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd"]
Mar 19 09:39:28.239555 master-0 kubenswrapper[15202]: I0319 09:39:28.239460 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd"
Mar 19 09:39:28.241910 master-0 kubenswrapper[15202]: I0319 09:39:28.241818 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-tgqwm"
Mar 19 09:39:28.243120 master-0 kubenswrapper[15202]: I0319 09:39:28.243050 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 09:39:28.243652 master-0 kubenswrapper[15202]: I0319 09:39:28.243593 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 09:39:28.243839 master-0 kubenswrapper[15202]: I0319 09:39:28.243785 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:39:28.244377 master-0 kubenswrapper[15202]: I0319 09:39:28.244301 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 09:39:28.246754 master-0 kubenswrapper[15202]: I0319 09:39:28.246704 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 09:39:28.249448 master-0 kubenswrapper[15202]: I0319 09:39:28.249385 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj"]
Mar 19 09:39:28.257792 master-0 kubenswrapper[15202]: I0319 09:39:28.257703 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd"]
Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354068 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-client-ca\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj"
Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354234 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4gp\" (UniqueName: \"kubernetes.io/projected/f3f9be63-dd40-4e94-a238-5a9642396a1b-kube-api-access-2k4gp\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd"
Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354352 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-config\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj"
Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354387 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f9be63-dd40-4e94-a238-5a9642396a1b-client-ca\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd"
Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354436 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flb9n\" (UniqueName: \"kubernetes.io/projected/94df5ea6-5e23-45d1-aae3-88f41b320eaf-kube-api-access-flb9n\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID:
\"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354506 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-proxy-ca-bundles\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354538 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f9be63-dd40-4e94-a238-5a9642396a1b-config\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354577 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94df5ea6-5e23-45d1-aae3-88f41b320eaf-serving-cert\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.355496 master-0 kubenswrapper[15202]: I0319 09:39:28.354621 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f9be63-dd40-4e94-a238-5a9642396a1b-serving-cert\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.456404 master-0 kubenswrapper[15202]: 
I0319 09:39:28.456299 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f9be63-dd40-4e94-a238-5a9642396a1b-serving-cert\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.456805 master-0 kubenswrapper[15202]: I0319 09:39:28.456453 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-client-ca\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.456805 master-0 kubenswrapper[15202]: I0319 09:39:28.456693 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4gp\" (UniqueName: \"kubernetes.io/projected/f3f9be63-dd40-4e94-a238-5a9642396a1b-kube-api-access-2k4gp\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.457391 master-0 kubenswrapper[15202]: I0319 09:39:28.457331 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-client-ca\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.457461 master-0 kubenswrapper[15202]: I0319 09:39:28.457424 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-config\") pod 
\"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.457725 master-0 kubenswrapper[15202]: I0319 09:39:28.457600 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f9be63-dd40-4e94-a238-5a9642396a1b-client-ca\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.459647 master-0 kubenswrapper[15202]: I0319 09:39:28.458443 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-config\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.459772 master-0 kubenswrapper[15202]: I0319 09:39:28.459430 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3f9be63-dd40-4e94-a238-5a9642396a1b-client-ca\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.459772 master-0 kubenswrapper[15202]: I0319 09:39:28.459717 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flb9n\" (UniqueName: \"kubernetes.io/projected/94df5ea6-5e23-45d1-aae3-88f41b320eaf-kube-api-access-flb9n\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.459876 master-0 kubenswrapper[15202]: I0319 09:39:28.459817 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-proxy-ca-bundles\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.459876 master-0 kubenswrapper[15202]: I0319 09:39:28.459863 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f9be63-dd40-4e94-a238-5a9642396a1b-config\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.461299 master-0 kubenswrapper[15202]: I0319 09:39:28.460966 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94df5ea6-5e23-45d1-aae3-88f41b320eaf-proxy-ca-bundles\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.461953 master-0 kubenswrapper[15202]: I0319 09:39:28.461807 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f9be63-dd40-4e94-a238-5a9642396a1b-config\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.461953 master-0 kubenswrapper[15202]: I0319 09:39:28.459962 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94df5ea6-5e23-45d1-aae3-88f41b320eaf-serving-cert\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: 
\"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.464707 master-0 kubenswrapper[15202]: I0319 09:39:28.464655 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f9be63-dd40-4e94-a238-5a9642396a1b-serving-cert\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.465962 master-0 kubenswrapper[15202]: I0319 09:39:28.465904 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94df5ea6-5e23-45d1-aae3-88f41b320eaf-serving-cert\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.475089 master-0 kubenswrapper[15202]: I0319 09:39:28.475058 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4gp\" (UniqueName: \"kubernetes.io/projected/f3f9be63-dd40-4e94-a238-5a9642396a1b-kube-api-access-2k4gp\") pod \"route-controller-manager-57dc475b7c-7h2xd\" (UID: \"f3f9be63-dd40-4e94-a238-5a9642396a1b\") " pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.476502 master-0 kubenswrapper[15202]: I0319 09:39:28.476379 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flb9n\" (UniqueName: \"kubernetes.io/projected/94df5ea6-5e23-45d1-aae3-88f41b320eaf-kube-api-access-flb9n\") pod \"controller-manager-5cbdcbd8d7-wz2vj\" (UID: \"94df5ea6-5e23-45d1-aae3-88f41b320eaf\") " pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.572303 master-0 kubenswrapper[15202]: I0319 09:39:28.572173 15202 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:28.607660 master-0 kubenswrapper[15202]: I0319 09:39:28.607583 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" event={"ID":"99d22e99-64ff-456e-878d-d68fef32b117","Type":"ContainerDied","Data":"fe19c5b27dac1656474706a53fa961b391af75de2b39ee0544a86bb35fa8f67c"} Mar 19 09:39:28.608073 master-0 kubenswrapper[15202]: I0319 09:39:28.608046 15202 scope.go:117] "RemoveContainer" containerID="12a1c38306f21000a7ffcb590c62cefad13f5e6406002c495d46830e8bb5389a" Mar 19 09:39:28.608286 master-0 kubenswrapper[15202]: I0319 09:39:28.607803 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bfdb854-c5vtx" Mar 19 09:39:28.612760 master-0 kubenswrapper[15202]: I0319 09:39:28.612702 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:28.707798 master-0 kubenswrapper[15202]: I0319 09:39:28.707753 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57bfdb854-c5vtx"] Mar 19 09:39:28.715374 master-0 kubenswrapper[15202]: I0319 09:39:28.713852 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57bfdb854-c5vtx"] Mar 19 09:39:28.824116 master-0 kubenswrapper[15202]: I0319 09:39:28.823957 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eca1dbd-d183-407d-99df-eea5aee3474d" path="/var/lib/kubelet/pods/4eca1dbd-d183-407d-99df-eea5aee3474d/volumes" Mar 19 09:39:28.825103 master-0 kubenswrapper[15202]: I0319 09:39:28.825067 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d22e99-64ff-456e-878d-d68fef32b117" path="/var/lib/kubelet/pods/99d22e99-64ff-456e-878d-d68fef32b117/volumes" Mar 19 09:39:29.076225 master-0 kubenswrapper[15202]: I0319 09:39:29.076070 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj"] Mar 19 09:39:29.076425 master-0 kubenswrapper[15202]: W0319 09:39:29.076354 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94df5ea6_5e23_45d1_aae3_88f41b320eaf.slice/crio-fe46134af08c9c49486c9da455c32cfed8d12c38f8e5e66c2057037ac94ae450 WatchSource:0}: Error finding container fe46134af08c9c49486c9da455c32cfed8d12c38f8e5e66c2057037ac94ae450: Status 404 returned error can't find the container with id fe46134af08c9c49486c9da455c32cfed8d12c38f8e5e66c2057037ac94ae450 Mar 19 09:39:29.169963 master-0 kubenswrapper[15202]: I0319 09:39:29.169899 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd"] Mar 19 09:39:29.619176 master-0 kubenswrapper[15202]: I0319 09:39:29.618987 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" event={"ID":"f3f9be63-dd40-4e94-a238-5a9642396a1b","Type":"ContainerStarted","Data":"cb213f7e00d463b3c7dd87006cc5cff20f9b54c96241dbe489172ba0ccdf3de0"} Mar 19 09:39:29.619176 master-0 kubenswrapper[15202]: I0319 09:39:29.619048 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" event={"ID":"f3f9be63-dd40-4e94-a238-5a9642396a1b","Type":"ContainerStarted","Data":"0990d94cae92cbc6255430f243872e684b6895c173d768b3bf1fd045ffb51479"} Mar 19 09:39:29.621114 master-0 kubenswrapper[15202]: I0319 09:39:29.621070 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:29.626121 master-0 kubenswrapper[15202]: I0319 09:39:29.626044 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" event={"ID":"94df5ea6-5e23-45d1-aae3-88f41b320eaf","Type":"ContainerStarted","Data":"66e4e37344a6ec617a6b7580a5c399e8a4a09d07e9c64d6c6758ebf64709e1ce"} Mar 19 09:39:29.626560 master-0 kubenswrapper[15202]: I0319 09:39:29.626448 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" event={"ID":"94df5ea6-5e23-45d1-aae3-88f41b320eaf","Type":"ContainerStarted","Data":"fe46134af08c9c49486c9da455c32cfed8d12c38f8e5e66c2057037ac94ae450"} Mar 19 09:39:29.628438 master-0 kubenswrapper[15202]: I0319 09:39:29.628408 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:29.633944 
master-0 kubenswrapper[15202]: I0319 09:39:29.633878 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" Mar 19 09:39:29.646679 master-0 kubenswrapper[15202]: I0319 09:39:29.644359 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" podStartSLOduration=3.644336579 podStartE2EDuration="3.644336579s" podCreationTimestamp="2026-03-19 09:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:39:29.64312393 +0000 UTC m=+887.028538766" watchObservedRunningTime="2026-03-19 09:39:29.644336579 +0000 UTC m=+887.029751395" Mar 19 09:39:29.669445 master-0 kubenswrapper[15202]: I0319 09:39:29.669332 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cbdcbd8d7-wz2vj" podStartSLOduration=3.669304749 podStartE2EDuration="3.669304749s" podCreationTimestamp="2026-03-19 09:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:39:29.669220687 +0000 UTC m=+887.054635513" watchObservedRunningTime="2026-03-19 09:39:29.669304749 +0000 UTC m=+887.054719565" Mar 19 09:39:29.984884 master-0 kubenswrapper[15202]: I0319 09:39:29.984813 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57dc475b7c-7h2xd" Mar 19 09:39:44.217832 master-0 kubenswrapper[15202]: I0319 09:39:44.217754 15202 scope.go:117] "RemoveContainer" containerID="ebf1733b19e744225a9e8c315e74e98e73b2be2483625475391f4ed66449d3f3" Mar 19 09:41:23.225049 master-0 kubenswrapper[15202]: I0319 09:41:23.224986 15202 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["sushy-emulator/sushy-emulator-59477995f9-w2dvk"] Mar 19 09:41:23.226204 master-0 kubenswrapper[15202]: I0319 09:41:23.226169 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.230141 master-0 kubenswrapper[15202]: I0319 09:41:23.230071 15202 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 19 09:41:23.230314 master-0 kubenswrapper[15202]: I0319 09:41:23.230234 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 19 09:41:23.230391 master-0 kubenswrapper[15202]: I0319 09:41:23.230324 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 19 09:41:23.230594 master-0 kubenswrapper[15202]: I0319 09:41:23.230545 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 19 09:41:23.247494 master-0 kubenswrapper[15202]: I0319 09:41:23.247409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-os-client-config\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.247654 master-0 kubenswrapper[15202]: I0319 09:41:23.247611 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.247712 master-0 kubenswrapper[15202]: I0319 09:41:23.247674 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpsmg\" (UniqueName: \"kubernetes.io/projected/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-kube-api-access-xpsmg\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.271206 master-0 kubenswrapper[15202]: I0319 09:41:23.271122 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-w2dvk"] Mar 19 09:41:23.349830 master-0 kubenswrapper[15202]: I0319 09:41:23.349763 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsmg\" (UniqueName: \"kubernetes.io/projected/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-kube-api-access-xpsmg\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.349830 master-0 kubenswrapper[15202]: I0319 09:41:23.349832 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.350119 master-0 kubenswrapper[15202]: I0319 09:41:23.349973 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-os-client-config\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.351188 master-0 kubenswrapper[15202]: I0319 09:41:23.350921 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: 
\"kubernetes.io/configmap/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.354175 master-0 kubenswrapper[15202]: I0319 09:41:23.354120 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-os-client-config\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.370310 master-0 kubenswrapper[15202]: I0319 09:41:23.370257 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpsmg\" (UniqueName: \"kubernetes.io/projected/55374844-a65f-4fc6-a2c6-818e8c8d5ef7-kube-api-access-xpsmg\") pod \"sushy-emulator-59477995f9-w2dvk\" (UID: \"55374844-a65f-4fc6-a2c6-818e8c8d5ef7\") " pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:23.565605 master-0 kubenswrapper[15202]: I0319 09:41:23.565441 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:24.053883 master-0 kubenswrapper[15202]: I0319 09:41:24.053829 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-w2dvk"] Mar 19 09:41:24.760915 master-0 kubenswrapper[15202]: I0319 09:41:24.760816 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" event={"ID":"55374844-a65f-4fc6-a2c6-818e8c8d5ef7","Type":"ContainerStarted","Data":"f77981b39716dd42904160bad9d199072ec289352626c98be6aa421e8273b291"} Mar 19 09:41:30.828747 master-0 kubenswrapper[15202]: I0319 09:41:30.828391 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" event={"ID":"55374844-a65f-4fc6-a2c6-818e8c8d5ef7","Type":"ContainerStarted","Data":"745bdd09630a931939cd16d51d78b108d2d4b4000a483afd26998a42de9d76ea"} Mar 19 09:41:30.864597 master-0 kubenswrapper[15202]: I0319 09:41:30.864400 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" podStartSLOduration=1.428583761 podStartE2EDuration="7.864367151s" podCreationTimestamp="2026-03-19 09:41:23 +0000 UTC" firstStartedPulling="2026-03-19 09:41:24.058718713 +0000 UTC m=+1001.444133529" lastFinishedPulling="2026-03-19 09:41:30.494502103 +0000 UTC m=+1007.879916919" observedRunningTime="2026-03-19 09:41:30.859749797 +0000 UTC m=+1008.245164653" watchObservedRunningTime="2026-03-19 09:41:30.864367151 +0000 UTC m=+1008.249781967" Mar 19 09:41:33.566548 master-0 kubenswrapper[15202]: I0319 09:41:33.566410 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:33.566548 master-0 kubenswrapper[15202]: I0319 09:41:33.566539 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 
09:41:33.590664 master-0 kubenswrapper[15202]: I0319 09:41:33.590559 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:33.852281 master-0 kubenswrapper[15202]: I0319 09:41:33.851974 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-59477995f9-w2dvk" Mar 19 09:41:53.905025 master-0 kubenswrapper[15202]: I0319 09:41:53.904946 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-676c49b655-wglrh"] Mar 19 09:41:53.907076 master-0 kubenswrapper[15202]: I0319 09:41:53.907043 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:53.920301 master-0 kubenswrapper[15202]: I0319 09:41:53.920215 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-676c49b655-wglrh"] Mar 19 09:41:53.949952 master-0 kubenswrapper[15202]: I0319 09:41:53.949894 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpn4c\" (UniqueName: \"kubernetes.io/projected/cc87d589-adb7-4000-8528-30bc1760d41b-kube-api-access-zpn4c\") pod \"nova-console-poller-676c49b655-wglrh\" (UID: \"cc87d589-adb7-4000-8528-30bc1760d41b\") " pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:53.950728 master-0 kubenswrapper[15202]: I0319 09:41:53.950692 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/cc87d589-adb7-4000-8528-30bc1760d41b-os-client-config\") pod \"nova-console-poller-676c49b655-wglrh\" (UID: \"cc87d589-adb7-4000-8528-30bc1760d41b\") " pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:54.052828 master-0 kubenswrapper[15202]: I0319 09:41:54.052770 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zpn4c\" (UniqueName: \"kubernetes.io/projected/cc87d589-adb7-4000-8528-30bc1760d41b-kube-api-access-zpn4c\") pod \"nova-console-poller-676c49b655-wglrh\" (UID: \"cc87d589-adb7-4000-8528-30bc1760d41b\") " pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:54.052828 master-0 kubenswrapper[15202]: I0319 09:41:54.052863 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/cc87d589-adb7-4000-8528-30bc1760d41b-os-client-config\") pod \"nova-console-poller-676c49b655-wglrh\" (UID: \"cc87d589-adb7-4000-8528-30bc1760d41b\") " pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:54.056594 master-0 kubenswrapper[15202]: I0319 09:41:54.056541 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/cc87d589-adb7-4000-8528-30bc1760d41b-os-client-config\") pod \"nova-console-poller-676c49b655-wglrh\" (UID: \"cc87d589-adb7-4000-8528-30bc1760d41b\") " pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:54.072631 master-0 kubenswrapper[15202]: I0319 09:41:54.072565 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpn4c\" (UniqueName: \"kubernetes.io/projected/cc87d589-adb7-4000-8528-30bc1760d41b-kube-api-access-zpn4c\") pod \"nova-console-poller-676c49b655-wglrh\" (UID: \"cc87d589-adb7-4000-8528-30bc1760d41b\") " pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:54.228951 master-0 kubenswrapper[15202]: I0319 09:41:54.228890 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" Mar 19 09:41:54.717169 master-0 kubenswrapper[15202]: I0319 09:41:54.717118 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-676c49b655-wglrh"] Mar 19 09:41:54.718931 master-0 kubenswrapper[15202]: W0319 09:41:54.718852 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc87d589_adb7_4000_8528_30bc1760d41b.slice/crio-8a5844259098ff3ca9cac44418a803d7ad0553f19de74536733b23d3e1f9a647 WatchSource:0}: Error finding container 8a5844259098ff3ca9cac44418a803d7ad0553f19de74536733b23d3e1f9a647: Status 404 returned error can't find the container with id 8a5844259098ff3ca9cac44418a803d7ad0553f19de74536733b23d3e1f9a647 Mar 19 09:41:55.063877 master-0 kubenswrapper[15202]: I0319 09:41:55.063710 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" event={"ID":"cc87d589-adb7-4000-8528-30bc1760d41b","Type":"ContainerStarted","Data":"8a5844259098ff3ca9cac44418a803d7ad0553f19de74536733b23d3e1f9a647"} Mar 19 09:42:01.122393 master-0 kubenswrapper[15202]: I0319 09:42:01.122149 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" event={"ID":"cc87d589-adb7-4000-8528-30bc1760d41b","Type":"ContainerStarted","Data":"3835e4347a58627e6cff5934d226154de2ab26fb77679d9fa9d2c7ec21989eb4"} Mar 19 09:42:01.122393 master-0 kubenswrapper[15202]: I0319 09:42:01.122234 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" event={"ID":"cc87d589-adb7-4000-8528-30bc1760d41b","Type":"ContainerStarted","Data":"26288fc2462c7cddfc59c04c05fd710191ca4e7a36a11819ff50357f22e67775"} Mar 19 09:42:01.151241 master-0 kubenswrapper[15202]: I0319 09:42:01.151125 15202 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="sushy-emulator/nova-console-poller-676c49b655-wglrh" podStartSLOduration=2.118035873 podStartE2EDuration="8.151096993s" podCreationTimestamp="2026-03-19 09:41:53 +0000 UTC" firstStartedPulling="2026-03-19 09:41:54.722817834 +0000 UTC m=+1032.108232650" lastFinishedPulling="2026-03-19 09:42:00.755878924 +0000 UTC m=+1038.141293770" observedRunningTime="2026-03-19 09:42:01.144862159 +0000 UTC m=+1038.530277105" watchObservedRunningTime="2026-03-19 09:42:01.151096993 +0000 UTC m=+1038.536511819" Mar 19 09:42:25.584053 master-0 kubenswrapper[15202]: I0319 09:42:25.583963 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj"] Mar 19 09:42:25.588659 master-0 kubenswrapper[15202]: I0319 09:42:25.586652 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.606152 master-0 kubenswrapper[15202]: I0319 09:42:25.604250 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj"] Mar 19 09:42:25.738227 master-0 kubenswrapper[15202]: I0319 09:42:25.738157 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/bac7ec7f-5721-498e-8bdd-b419ade1a216-os-client-config\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.738634 master-0 kubenswrapper[15202]: I0319 09:42:25.738606 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76sd\" (UniqueName: \"kubernetes.io/projected/bac7ec7f-5721-498e-8bdd-b419ade1a216-kube-api-access-w76sd\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " 
pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.738948 master-0 kubenswrapper[15202]: I0319 09:42:25.738928 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/bac7ec7f-5721-498e-8bdd-b419ade1a216-nova-console-recordings-pv\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.840587 master-0 kubenswrapper[15202]: I0319 09:42:25.840414 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/bac7ec7f-5721-498e-8bdd-b419ade1a216-os-client-config\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.840844 master-0 kubenswrapper[15202]: I0319 09:42:25.840729 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w76sd\" (UniqueName: \"kubernetes.io/projected/bac7ec7f-5721-498e-8bdd-b419ade1a216-kube-api-access-w76sd\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.840844 master-0 kubenswrapper[15202]: I0319 09:42:25.840806 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/bac7ec7f-5721-498e-8bdd-b419ade1a216-nova-console-recordings-pv\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.847036 master-0 kubenswrapper[15202]: I0319 09:42:25.846975 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-client-config\" (UniqueName: \"kubernetes.io/secret/bac7ec7f-5721-498e-8bdd-b419ade1a216-os-client-config\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:25.871576 master-0 kubenswrapper[15202]: I0319 09:42:25.871493 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76sd\" (UniqueName: \"kubernetes.io/projected/bac7ec7f-5721-498e-8bdd-b419ade1a216-kube-api-access-w76sd\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:26.488530 master-0 kubenswrapper[15202]: I0319 09:42:26.488346 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/bac7ec7f-5721-498e-8bdd-b419ade1a216-nova-console-recordings-pv\") pod \"nova-console-recorder-6d7748fc8c-9phbj\" (UID: \"bac7ec7f-5721-498e-8bdd-b419ade1a216\") " pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:26.522077 master-0 kubenswrapper[15202]: I0319 09:42:26.520914 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" Mar 19 09:42:26.964418 master-0 kubenswrapper[15202]: I0319 09:42:26.964368 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj"] Mar 19 09:42:26.968411 master-0 kubenswrapper[15202]: W0319 09:42:26.967714 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbac7ec7f_5721_498e_8bdd_b419ade1a216.slice/crio-e2b6a83e2bca736c89ed5e99723887612d434e8e05be2d0aab043a3a59e57d20 WatchSource:0}: Error finding container e2b6a83e2bca736c89ed5e99723887612d434e8e05be2d0aab043a3a59e57d20: Status 404 returned error can't find the container with id e2b6a83e2bca736c89ed5e99723887612d434e8e05be2d0aab043a3a59e57d20 Mar 19 09:42:27.397065 master-0 kubenswrapper[15202]: I0319 09:42:27.394170 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" event={"ID":"bac7ec7f-5721-498e-8bdd-b419ade1a216","Type":"ContainerStarted","Data":"e2b6a83e2bca736c89ed5e99723887612d434e8e05be2d0aab043a3a59e57d20"} Mar 19 09:42:35.471280 master-0 kubenswrapper[15202]: I0319 09:42:35.471189 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" event={"ID":"bac7ec7f-5721-498e-8bdd-b419ade1a216","Type":"ContainerStarted","Data":"405266eb93da60fd91a159405d814d970b4e2f3ead167acdbc51d8b46d31f7fb"} Mar 19 09:42:36.487889 master-0 kubenswrapper[15202]: I0319 09:42:36.487803 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" event={"ID":"bac7ec7f-5721-498e-8bdd-b419ade1a216","Type":"ContainerStarted","Data":"886a6419375264fac295c1db3ece0e8c4cae35a7ddac7fd0a713a0ef70f8d21d"} Mar 19 09:42:36.527443 master-0 kubenswrapper[15202]: I0319 09:42:36.527177 15202 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="sushy-emulator/nova-console-recorder-6d7748fc8c-9phbj" podStartSLOduration=2.870031272 podStartE2EDuration="11.52714031s" podCreationTimestamp="2026-03-19 09:42:25 +0000 UTC" firstStartedPulling="2026-03-19 09:42:26.970125381 +0000 UTC m=+1064.355540197" lastFinishedPulling="2026-03-19 09:42:35.627234409 +0000 UTC m=+1073.012649235" observedRunningTime="2026-03-19 09:42:36.521128663 +0000 UTC m=+1073.906543489" watchObservedRunningTime="2026-03-19 09:42:36.52714031 +0000 UTC m=+1073.912555136" Mar 19 09:43:07.888048 master-0 kubenswrapper[15202]: I0319 09:43:07.887963 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/metal3-546c754db-8r9wh"] Mar 19 09:43:07.890546 master-0 kubenswrapper[15202]: I0319 09:43:07.890515 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:07.892618 master-0 kubenswrapper[15202]: I0319 09:43:07.892552 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"metal3-ironic-tls" Mar 19 09:43:07.892743 master-0 kubenswrapper[15202]: I0319 09:43:07.892560 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"metal3-ironic-password" Mar 19 09:43:07.895548 master-0 kubenswrapper[15202]: I0319 09:43:07.895506 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"baremetal-operator-webhook-server-cert" Mar 19 09:43:07.903873 master-0 kubenswrapper[15202]: I0319 09:43:07.903814 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cbo-trusted-ca" Mar 19 09:43:08.000169 master-0 kubenswrapper[15202]: I0319 09:43:08.000091 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-ironic-tls\") pod 
\"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.000169 master-0 kubenswrapper[15202]: I0319 09:43:08.000161 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-shared-image-cache\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.000581 master-0 kubenswrapper[15202]: I0319 09:43:08.000252 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-vmedia-tls\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-vmedia-tls\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.000862 master-0 kubenswrapper[15202]: I0319 09:43:08.000794 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stlf9\" (UniqueName: \"kubernetes.io/projected/90e6a8d7-86b3-4082-a6f7-4d1001e48563-kube-api-access-stlf9\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.001073 master-0 kubenswrapper[15202]: I0319 09:43:08.001020 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-ironic-basic-auth\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.001559 master-0 kubenswrapper[15202]: I0319 09:43:08.001412 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6a8d7-86b3-4082-a6f7-4d1001e48563-trusted-ca\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.001559 master-0 kubenswrapper[15202]: I0319 09:43:08.001517 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-shared\" (UniqueName: \"kubernetes.io/empty-dir/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-shared\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.103673 master-0 kubenswrapper[15202]: I0319 09:43:08.103571 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-shared\" (UniqueName: \"kubernetes.io/empty-dir/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-shared\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104157 master-0 kubenswrapper[15202]: I0319 09:43:08.103904 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-ironic-tls\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104157 master-0 kubenswrapper[15202]: I0319 09:43:08.103996 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-shared-image-cache\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104157 master-0 
kubenswrapper[15202]: I0319 09:43:08.104051 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-vmedia-tls\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-vmedia-tls\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104434 master-0 kubenswrapper[15202]: I0319 09:43:08.104173 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stlf9\" (UniqueName: \"kubernetes.io/projected/90e6a8d7-86b3-4082-a6f7-4d1001e48563-kube-api-access-stlf9\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104434 master-0 kubenswrapper[15202]: I0319 09:43:08.104279 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-ironic-basic-auth\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104434 master-0 kubenswrapper[15202]: I0319 09:43:08.104394 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-shared-image-cache\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.104719 master-0 kubenswrapper[15202]: I0319 09:43:08.104590 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-shared\" (UniqueName: \"kubernetes.io/empty-dir/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-shared\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " 
pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.105342 master-0 kubenswrapper[15202]: I0319 09:43:08.105280 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6a8d7-86b3-4082-a6f7-4d1001e48563-trusted-ca\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.107107 master-0 kubenswrapper[15202]: I0319 09:43:08.107050 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6a8d7-86b3-4082-a6f7-4d1001e48563-trusted-ca\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.107853 master-0 kubenswrapper[15202]: I0319 09:43:08.107799 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-ironic-basic-auth\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.107853 master-0 kubenswrapper[15202]: I0319 09:43:08.107838 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-ironic-tls\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.108185 master-0 kubenswrapper[15202]: I0319 09:43:08.107946 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-vmedia-tls\" (UniqueName: \"kubernetes.io/secret/90e6a8d7-86b3-4082-a6f7-4d1001e48563-metal3-vmedia-tls\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " 
pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.120496 master-0 kubenswrapper[15202]: I0319 09:43:08.120426 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stlf9\" (UniqueName: \"kubernetes.io/projected/90e6a8d7-86b3-4082-a6f7-4d1001e48563-kube-api-access-stlf9\") pod \"metal3-546c754db-8r9wh\" (UID: \"90e6a8d7-86b3-4082-a6f7-4d1001e48563\") " pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.210747 master-0 kubenswrapper[15202]: I0319 09:43:08.210646 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-546c754db-8r9wh" Mar 19 09:43:08.238196 master-0 kubenswrapper[15202]: W0319 09:43:08.238110 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e6a8d7_86b3_4082_a6f7_4d1001e48563.slice/crio-8b9dc69a25c7728842d3057a697fa4c9344093b6ba2a0d2497916db4e7adedc1 WatchSource:0}: Error finding container 8b9dc69a25c7728842d3057a697fa4c9344093b6ba2a0d2497916db4e7adedc1: Status 404 returned error can't find the container with id 8b9dc69a25c7728842d3057a697fa4c9344093b6ba2a0d2497916db4e7adedc1 Mar 19 09:43:08.667982 master-0 kubenswrapper[15202]: I0319 09:43:08.667824 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr"] Mar 19 09:43:08.669083 master-0 kubenswrapper[15202]: I0319 09:43:08.669050 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.684203 master-0 kubenswrapper[15202]: I0319 09:43:08.684049 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr"] Mar 19 09:43:08.819191 master-0 kubenswrapper[15202]: I0319 09:43:08.819083 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.819191 master-0 kubenswrapper[15202]: I0319 09:43:08.819182 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-metal3-ironic-tls\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.819619 master-0 kubenswrapper[15202]: I0319 09:43:08.819365 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f4c14-046a-493e-b6b8-1e821c66b504-trusted-ca\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.819619 master-0 kubenswrapper[15202]: I0319 09:43:08.819604 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-metal3-ironic-basic-auth\") pod 
\"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.819729 master-0 kubenswrapper[15202]: I0319 09:43:08.819682 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrw2\" (UniqueName: \"kubernetes.io/projected/bf7f4c14-046a-493e-b6b8-1e821c66b504-kube-api-access-rqrw2\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.846930 master-0 kubenswrapper[15202]: I0319 09:43:08.846820 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-546c754db-8r9wh" event={"ID":"90e6a8d7-86b3-4082-a6f7-4d1001e48563","Type":"ContainerStarted","Data":"8b9dc69a25c7728842d3057a697fa4c9344093b6ba2a0d2497916db4e7adedc1"} Mar 19 09:43:08.922857 master-0 kubenswrapper[15202]: I0319 09:43:08.922646 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrw2\" (UniqueName: \"kubernetes.io/projected/bf7f4c14-046a-493e-b6b8-1e821c66b504-kube-api-access-rqrw2\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.923498 master-0 kubenswrapper[15202]: I0319 09:43:08.923079 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.923498 master-0 kubenswrapper[15202]: I0319 09:43:08.923215 15202 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-metal3-ironic-tls\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.923498 master-0 kubenswrapper[15202]: E0319 09:43:08.923313 15202 secret.go:189] Couldn't get secret openshift-machine-api/baremetal-operator-webhook-server-cert: secret "baremetal-operator-webhook-server-cert" not found Mar 19 09:43:08.923498 master-0 kubenswrapper[15202]: I0319 09:43:08.923327 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f4c14-046a-493e-b6b8-1e821c66b504-trusted-ca\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.923498 master-0 kubenswrapper[15202]: E0319 09:43:08.923407 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert podName:bf7f4c14-046a-493e-b6b8-1e821c66b504 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:09.423376063 +0000 UTC m=+1106.808790919 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert") pod "metal3-baremetal-operator-78474bdc48-lpxgr" (UID: "bf7f4c14-046a-493e-b6b8-1e821c66b504") : secret "baremetal-operator-webhook-server-cert" not found Mar 19 09:43:08.923697 master-0 kubenswrapper[15202]: I0319 09:43:08.923657 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-metal3-ironic-basic-auth\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.924625 master-0 kubenswrapper[15202]: I0319 09:43:08.924589 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf7f4c14-046a-493e-b6b8-1e821c66b504-trusted-ca\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.928421 master-0 kubenswrapper[15202]: I0319 09:43:08.928260 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-metal3-ironic-tls\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.928421 master-0 kubenswrapper[15202]: I0319 09:43:08.928294 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-metal3-ironic-basic-auth\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " 
pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:08.957678 master-0 kubenswrapper[15202]: I0319 09:43:08.957622 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqrw2\" (UniqueName: \"kubernetes.io/projected/bf7f4c14-046a-493e-b6b8-1e821c66b504-kube-api-access-rqrw2\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:09.448676 master-0 kubenswrapper[15202]: I0319 09:43:09.448601 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:09.448944 master-0 kubenswrapper[15202]: E0319 09:43:09.448837 15202 secret.go:189] Couldn't get secret openshift-machine-api/baremetal-operator-webhook-server-cert: secret "baremetal-operator-webhook-server-cert" not found Mar 19 09:43:09.448944 master-0 kubenswrapper[15202]: E0319 09:43:09.448909 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert podName:bf7f4c14-046a-493e-b6b8-1e821c66b504 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:10.448888906 +0000 UTC m=+1107.834303722 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert") pod "metal3-baremetal-operator-78474bdc48-lpxgr" (UID: "bf7f4c14-046a-493e-b6b8-1e821c66b504") : secret "baremetal-operator-webhook-server-cert" not found Mar 19 09:43:10.502689 master-0 kubenswrapper[15202]: I0319 09:43:10.472443 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:10.505366 master-0 kubenswrapper[15202]: I0319 09:43:10.505314 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf7f4c14-046a-493e-b6b8-1e821c66b504-cert\") pod \"metal3-baremetal-operator-78474bdc48-lpxgr\" (UID: \"bf7f4c14-046a-493e-b6b8-1e821c66b504\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:10.789612 master-0 kubenswrapper[15202]: I0319 09:43:10.789481 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" Mar 19 09:43:11.388076 master-0 kubenswrapper[15202]: I0319 09:43:11.388015 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr"] Mar 19 09:43:11.433233 master-0 kubenswrapper[15202]: I0319 09:43:11.433180 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj"] Mar 19 09:43:11.438046 master-0 kubenswrapper[15202]: I0319 09:43:11.437989 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.440717 master-0 kubenswrapper[15202]: I0319 09:43:11.440682 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"pull-secret" Mar 19 09:43:11.457924 master-0 kubenswrapper[15202]: I0319 09:43:11.443460 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj"] Mar 19 09:43:11.590981 master-0 kubenswrapper[15202]: I0319 09:43:11.590900 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"user-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-user-ca-bundle\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.591635 master-0 kubenswrapper[15202]: I0319 09:43:11.590996 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/127c2b98-5be4-46f3-95d6-1901fab637ff-trusted-ca\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.591635 master-0 kubenswrapper[15202]: I0319 09:43:11.591101 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmdx9\" (UniqueName: \"kubernetes.io/projected/127c2b98-5be4-46f3-95d6-1901fab637ff-kube-api-access-bmdx9\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.591635 master-0 kubenswrapper[15202]: I0319 09:43:11.591143 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ironic-agent-pull-secret\" (UniqueName: \"kubernetes.io/secret/127c2b98-5be4-46f3-95d6-1901fab637ff-ironic-agent-pull-secret\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.591635 master-0 kubenswrapper[15202]: I0319 09:43:11.591166 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-metal3-shared-image-cache\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.591635 master-0 kubenswrapper[15202]: I0319 09:43:11.591184 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-image-customization-volume\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-metal3-image-customization-volume\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.692758 master-0 kubenswrapper[15202]: I0319 09:43:11.692694 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmdx9\" (UniqueName: \"kubernetes.io/projected/127c2b98-5be4-46f3-95d6-1901fab637ff-kube-api-access-bmdx9\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.692946 master-0 kubenswrapper[15202]: I0319 09:43:11.692784 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ironic-agent-pull-secret\" (UniqueName: \"kubernetes.io/secret/127c2b98-5be4-46f3-95d6-1901fab637ff-ironic-agent-pull-secret\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.693281 master-0 kubenswrapper[15202]: I0319 09:43:11.693004 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-metal3-shared-image-cache\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.693368 master-0 kubenswrapper[15202]: I0319 09:43:11.693301 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-image-customization-volume\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-metal3-image-customization-volume\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.693419 master-0 kubenswrapper[15202]: I0319 09:43:11.693366 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-metal3-shared-image-cache\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.693629 master-0 kubenswrapper[15202]: I0319 09:43:11.693574 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-image-customization-volume\" (UniqueName: 
\"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-metal3-image-customization-volume\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.693755 master-0 kubenswrapper[15202]: I0319 09:43:11.693728 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"user-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-user-ca-bundle\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.693830 master-0 kubenswrapper[15202]: I0319 09:43:11.693808 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/127c2b98-5be4-46f3-95d6-1901fab637ff-trusted-ca\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.694182 master-0 kubenswrapper[15202]: I0319 09:43:11.694105 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"user-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/127c2b98-5be4-46f3-95d6-1901fab637ff-user-ca-bundle\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.695380 master-0 kubenswrapper[15202]: I0319 09:43:11.695186 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/127c2b98-5be4-46f3-95d6-1901fab637ff-trusted-ca\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " 
pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.697201 master-0 kubenswrapper[15202]: I0319 09:43:11.697148 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ironic-agent-pull-secret\" (UniqueName: \"kubernetes.io/secret/127c2b98-5be4-46f3-95d6-1901fab637ff-ironic-agent-pull-secret\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.709012 master-0 kubenswrapper[15202]: I0319 09:43:11.708951 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmdx9\" (UniqueName: \"kubernetes.io/projected/127c2b98-5be4-46f3-95d6-1901fab637ff-kube-api-access-bmdx9\") pod \"metal3-image-customization-7b5d8dfcfd-gjzrj\" (UID: \"127c2b98-5be4-46f3-95d6-1901fab637ff\") " pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.781925 master-0 kubenswrapper[15202]: I0319 09:43:11.781846 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" Mar 19 09:43:11.873213 master-0 kubenswrapper[15202]: I0319 09:43:11.873152 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" event={"ID":"bf7f4c14-046a-493e-b6b8-1e821c66b504","Type":"ContainerStarted","Data":"22cce2cb640fc0d3d272f0c5c5817d8449a2ebe7159e91369caae5ffca4d7c18"} Mar 19 09:43:12.229766 master-0 kubenswrapper[15202]: I0319 09:43:12.229681 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/ironic-proxy-kc5xl"] Mar 19 09:43:12.230950 master-0 kubenswrapper[15202]: I0319 09:43:12.230924 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.243155 master-0 kubenswrapper[15202]: I0319 09:43:12.243117 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj"] Mar 19 09:43:12.306859 master-0 kubenswrapper[15202]: I0319 09:43:12.306791 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-metal3-ironic-tls\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.306859 master-0 kubenswrapper[15202]: I0319 09:43:12.306860 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j67g\" (UniqueName: \"kubernetes.io/projected/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-kube-api-access-9j67g\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.307122 master-0 kubenswrapper[15202]: I0319 09:43:12.306947 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-trusted-ca\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.408608 master-0 kubenswrapper[15202]: I0319 09:43:12.408532 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-trusted-ca\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.408887 master-0 kubenswrapper[15202]: I0319 09:43:12.408708 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-metal3-ironic-tls\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.408887 master-0 kubenswrapper[15202]: I0319 09:43:12.408748 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j67g\" (UniqueName: \"kubernetes.io/projected/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-kube-api-access-9j67g\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.409694 master-0 kubenswrapper[15202]: I0319 09:43:12.409647 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-trusted-ca\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.415017 master-0 kubenswrapper[15202]: I0319 09:43:12.414957 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-metal3-ironic-tls\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.429249 master-0 kubenswrapper[15202]: I0319 09:43:12.429188 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j67g\" (UniqueName: \"kubernetes.io/projected/edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1-kube-api-access-9j67g\") pod \"ironic-proxy-kc5xl\" (UID: \"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1\") " pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.571096 master-0 kubenswrapper[15202]: I0319 09:43:12.570907 15202 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/ironic-proxy-kc5xl" Mar 19 09:43:12.614147 master-0 kubenswrapper[15202]: W0319 09:43:12.614062 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb10ae4_b456_4b8f_8ed0_95b53ba1bdf1.slice/crio-47499c3039f73fbe8f1c0bd0a4b17c1e2c4556987dcce35c6750d0975e38a9a9 WatchSource:0}: Error finding container 47499c3039f73fbe8f1c0bd0a4b17c1e2c4556987dcce35c6750d0975e38a9a9: Status 404 returned error can't find the container with id 47499c3039f73fbe8f1c0bd0a4b17c1e2c4556987dcce35c6750d0975e38a9a9 Mar 19 09:43:12.883753 master-0 kubenswrapper[15202]: I0319 09:43:12.883608 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/ironic-proxy-kc5xl" event={"ID":"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1","Type":"ContainerStarted","Data":"47499c3039f73fbe8f1c0bd0a4b17c1e2c4556987dcce35c6750d0975e38a9a9"} Mar 19 09:43:12.885356 master-0 kubenswrapper[15202]: I0319 09:43:12.885302 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" event={"ID":"127c2b98-5be4-46f3-95d6-1901fab637ff","Type":"ContainerStarted","Data":"b7e341ab969fcd8bd133e7848344f707ee9509e3b189964cb45d6044bc9e317d"} Mar 19 09:43:14.907649 master-0 kubenswrapper[15202]: I0319 09:43:14.907537 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" event={"ID":"bf7f4c14-046a-493e-b6b8-1e821c66b504","Type":"ContainerStarted","Data":"54686b6ede4d8ab1ec0bb94a464a5ba1f330335f1a9b94d2f6ad9f18afeb872c"} Mar 19 09:43:14.937535 master-0 kubenswrapper[15202]: I0319 09:43:14.937412 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-lpxgr" podStartSLOduration=3.986065235 podStartE2EDuration="6.937395326s" 
podCreationTimestamp="2026-03-19 09:43:08 +0000 UTC" firstStartedPulling="2026-03-19 09:43:11.41765568 +0000 UTC m=+1108.803070496" lastFinishedPulling="2026-03-19 09:43:14.368985771 +0000 UTC m=+1111.754400587" observedRunningTime="2026-03-19 09:43:14.93267201 +0000 UTC m=+1112.318086826" watchObservedRunningTime="2026-03-19 09:43:14.937395326 +0000 UTC m=+1112.322810142" Mar 19 09:43:25.002163 master-0 kubenswrapper[15202]: I0319 09:43:25.002035 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/ironic-proxy-kc5xl" event={"ID":"edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1","Type":"ContainerStarted","Data":"1153b3abed40c757dce6508f50ca214311ee79b73978a0b67530ca034b9be919"} Mar 19 09:43:25.038169 master-0 kubenswrapper[15202]: I0319 09:43:25.037986 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/ironic-proxy-kc5xl" podStartSLOduration=1.599334563 podStartE2EDuration="13.037958549s" podCreationTimestamp="2026-03-19 09:43:12 +0000 UTC" firstStartedPulling="2026-03-19 09:43:12.624973117 +0000 UTC m=+1110.010387943" lastFinishedPulling="2026-03-19 09:43:24.063597113 +0000 UTC m=+1121.449011929" observedRunningTime="2026-03-19 09:43:25.022288395 +0000 UTC m=+1122.407703221" watchObservedRunningTime="2026-03-19 09:43:25.037958549 +0000 UTC m=+1122.423373365" Mar 19 09:43:45.176032 master-0 kubenswrapper[15202]: I0319 09:43:45.175965 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/0.log" Mar 19 09:43:45.176032 master-0 kubenswrapper[15202]: I0319 09:43:45.176033 15202 generic.go:334] "Generic (PLEG): container finished" podID="127c2b98-5be4-46f3-95d6-1901fab637ff" containerID="0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2" exitCode=1 Mar 19 09:43:45.176726 master-0 kubenswrapper[15202]: I0319 09:43:45.176083 15202 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" event={"ID":"127c2b98-5be4-46f3-95d6-1901fab637ff","Type":"ContainerDied","Data":"0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2"} Mar 19 09:43:45.179506 master-0 kubenswrapper[15202]: I0319 09:43:45.179297 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-546c754db-8r9wh" event={"ID":"90e6a8d7-86b3-4082-a6f7-4d1001e48563","Type":"ContainerStarted","Data":"f4cb52c4e655dea3ab58166e8c7340423e8f91136eed85f8e765a0a7aa1c8c0a"} Mar 19 09:43:46.221018 master-0 kubenswrapper[15202]: I0319 09:43:46.220944 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/0.log" Mar 19 09:43:46.221639 master-0 kubenswrapper[15202]: I0319 09:43:46.221271 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" event={"ID":"127c2b98-5be4-46f3-95d6-1901fab637ff","Type":"ContainerStarted","Data":"dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99"} Mar 19 09:43:47.242328 master-0 kubenswrapper[15202]: I0319 09:43:47.242233 15202 generic.go:334] "Generic (PLEG): container finished" podID="90e6a8d7-86b3-4082-a6f7-4d1001e48563" containerID="f4cb52c4e655dea3ab58166e8c7340423e8f91136eed85f8e765a0a7aa1c8c0a" exitCode=0 Mar 19 09:43:47.243416 master-0 kubenswrapper[15202]: I0319 09:43:47.242749 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-546c754db-8r9wh" event={"ID":"90e6a8d7-86b3-4082-a6f7-4d1001e48563","Type":"ContainerDied","Data":"f4cb52c4e655dea3ab58166e8c7340423e8f91136eed85f8e765a0a7aa1c8c0a"} Mar 19 09:43:48.251117 master-0 kubenswrapper[15202]: I0319 09:43:48.251048 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-546c754db-8r9wh" 
event={"ID":"90e6a8d7-86b3-4082-a6f7-4d1001e48563","Type":"ContainerStarted","Data":"3ed61417c7c09dc0345c952ff9d89cc3b451c8fa81c918d7bb58482fe11b39fa"} Mar 19 09:43:48.252771 master-0 kubenswrapper[15202]: I0319 09:43:48.252737 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/1.log" Mar 19 09:43:48.253351 master-0 kubenswrapper[15202]: I0319 09:43:48.253312 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/0.log" Mar 19 09:43:48.253407 master-0 kubenswrapper[15202]: I0319 09:43:48.253363 15202 generic.go:334] "Generic (PLEG): container finished" podID="127c2b98-5be4-46f3-95d6-1901fab637ff" containerID="dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99" exitCode=1 Mar 19 09:43:48.253407 master-0 kubenswrapper[15202]: I0319 09:43:48.253391 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" event={"ID":"127c2b98-5be4-46f3-95d6-1901fab637ff","Type":"ContainerDied","Data":"dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99"} Mar 19 09:43:48.253491 master-0 kubenswrapper[15202]: I0319 09:43:48.253422 15202 scope.go:117] "RemoveContainer" containerID="0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2" Mar 19 09:43:48.254203 master-0 kubenswrapper[15202]: I0319 09:43:48.254174 15202 scope.go:117] "RemoveContainer" containerID="0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2" Mar 19 09:43:48.635526 master-0 kubenswrapper[15202]: E0319 09:43:48.635389 15202 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_machine-os-images_metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api_127c2b98-5be4-46f3-95d6-1901fab637ff_0 in pod sandbox b7e341ab969fcd8bd133e7848344f707ee9509e3b189964cb45d6044bc9e317d from index: no such id: '0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2'" containerID="0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2" Mar 19 09:43:48.635763 master-0 kubenswrapper[15202]: E0319 09:43:48.635625 15202 kuberuntime_container.go:896] "Unhandled Error" err="failed to remove pod init container \"machine-os-images\": rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api_127c2b98-5be4-46f3-95d6-1901fab637ff_0 in pod sandbox b7e341ab969fcd8bd133e7848344f707ee9509e3b189964cb45d6044bc9e317d from index: no such id: '0686314374d6d8b741e4058b174e348f5cc9b5402b434cd8a09977911003b2b2'; Skipping pod \"metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api(127c2b98-5be4-46f3-95d6-1901fab637ff)\"" logger="UnhandledError" Mar 19 09:43:48.637676 master-0 kubenswrapper[15202]: E0319 09:43:48.637619 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-os-images\" with CrashLoopBackOff: \"back-off 10s restarting failed container=machine-os-images pod=metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api(127c2b98-5be4-46f3-95d6-1901fab637ff)\"" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" podUID="127c2b98-5be4-46f3-95d6-1901fab637ff" Mar 19 09:43:49.274429 master-0 kubenswrapper[15202]: I0319 09:43:49.274316 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-546c754db-8r9wh" event={"ID":"90e6a8d7-86b3-4082-a6f7-4d1001e48563","Type":"ContainerStarted","Data":"48fa434bba7d4f4150ec99dddd18eabd2424969225c31ee83ff1584a7b2452a2"} Mar 19 09:43:49.277059 master-0 kubenswrapper[15202]: I0319 09:43:49.276969 15202 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/1.log" Mar 19 09:43:50.291429 master-0 kubenswrapper[15202]: I0319 09:43:50.291278 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-546c754db-8r9wh" event={"ID":"90e6a8d7-86b3-4082-a6f7-4d1001e48563","Type":"ContainerStarted","Data":"072a7352bb6d7096fb14880864d64d12d2f022e58d0bbad101db1eafcf33764f"} Mar 19 09:43:51.579895 master-0 kubenswrapper[15202]: I0319 09:43:51.579787 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/metal3-546c754db-8r9wh" podStartSLOduration=8.889200278 podStartE2EDuration="44.579769442s" podCreationTimestamp="2026-03-19 09:43:07 +0000 UTC" firstStartedPulling="2026-03-19 09:43:08.240855841 +0000 UTC m=+1105.626270707" lastFinishedPulling="2026-03-19 09:43:43.931425055 +0000 UTC m=+1141.316839871" observedRunningTime="2026-03-19 09:43:51.578534402 +0000 UTC m=+1148.963949228" watchObservedRunningTime="2026-03-19 09:43:51.579769442 +0000 UTC m=+1148.965184278" Mar 19 09:44:00.856929 master-0 kubenswrapper[15202]: I0319 09:44:00.856849 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4"] Mar 19 09:44:00.858601 master-0 kubenswrapper[15202]: I0319 09:44:00.858553 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:00.862733 master-0 kubenswrapper[15202]: I0319 09:44:00.862667 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-2l79b" Mar 19 09:44:00.873737 master-0 kubenswrapper[15202]: I0319 09:44:00.873687 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4"] Mar 19 09:44:00.951266 master-0 kubenswrapper[15202]: I0319 09:44:00.951185 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:00.951266 master-0 kubenswrapper[15202]: I0319 09:44:00.951261 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:00.951542 master-0 kubenswrapper[15202]: I0319 09:44:00.951342 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2nph\" (UniqueName: \"kubernetes.io/projected/293d8dea-835a-4a95-8f41-6e6ea369b85a-kube-api-access-r2nph\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.053129 master-0 kubenswrapper[15202]: I0319 09:44:01.053048 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.053129 master-0 kubenswrapper[15202]: I0319 09:44:01.053113 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.053495 master-0 kubenswrapper[15202]: I0319 09:44:01.053181 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2nph\" (UniqueName: \"kubernetes.io/projected/293d8dea-835a-4a95-8f41-6e6ea369b85a-kube-api-access-r2nph\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.053731 master-0 kubenswrapper[15202]: I0319 09:44:01.053686 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 
09:44:01.053838 master-0 kubenswrapper[15202]: I0319 09:44:01.053799 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.082175 master-0 kubenswrapper[15202]: I0319 09:44:01.082087 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2nph\" (UniqueName: \"kubernetes.io/projected/293d8dea-835a-4a95-8f41-6e6ea369b85a-kube-api-access-r2nph\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.183330 master-0 kubenswrapper[15202]: I0319 09:44:01.183263 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" Mar 19 09:44:01.650864 master-0 kubenswrapper[15202]: I0319 09:44:01.649921 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4"] Mar 19 09:44:01.658716 master-0 kubenswrapper[15202]: W0319 09:44:01.658663 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod293d8dea_835a_4a95_8f41_6e6ea369b85a.slice/crio-132cbe964354ee1a3e17c7e48d2370c47682d34e9417dc82375590bbaf491ad1 WatchSource:0}: Error finding container 132cbe964354ee1a3e17c7e48d2370c47682d34e9417dc82375590bbaf491ad1: Status 404 returned error can't find the container with id 132cbe964354ee1a3e17c7e48d2370c47682d34e9417dc82375590bbaf491ad1 Mar 19 09:44:02.603852 master-0 kubenswrapper[15202]: I0319 09:44:02.603801 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/1.log" Mar 19 09:44:02.604540 master-0 kubenswrapper[15202]: I0319 09:44:02.603861 15202 generic.go:334] "Generic (PLEG): container finished" podID="127c2b98-5be4-46f3-95d6-1901fab637ff" containerID="7cd0cff70285882000c8f225792f21809b02f776906de3a7671003d0e95a26d8" exitCode=0 Mar 19 09:44:02.604540 master-0 kubenswrapper[15202]: I0319 09:44:02.603925 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" event={"ID":"127c2b98-5be4-46f3-95d6-1901fab637ff","Type":"ContainerDied","Data":"7cd0cff70285882000c8f225792f21809b02f776906de3a7671003d0e95a26d8"} Mar 19 09:44:02.604540 master-0 kubenswrapper[15202]: I0319 09:44:02.603960 15202 scope.go:117] "RemoveContainer" containerID="dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99" Mar 19 
09:44:02.605311 master-0 kubenswrapper[15202]: I0319 09:44:02.605195 15202 scope.go:117] "RemoveContainer" containerID="dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99"
Mar 19 09:44:02.607205 master-0 kubenswrapper[15202]: I0319 09:44:02.607161 15202 generic.go:334] "Generic (PLEG): container finished" podID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerID="5eccdad75cce8fccec26b4a1d7323d0d39e68befc6faf014c321462ed20620fd" exitCode=0
Mar 19 09:44:02.607281 master-0 kubenswrapper[15202]: I0319 09:44:02.607210 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" event={"ID":"293d8dea-835a-4a95-8f41-6e6ea369b85a","Type":"ContainerDied","Data":"5eccdad75cce8fccec26b4a1d7323d0d39e68befc6faf014c321462ed20620fd"}
Mar 19 09:44:02.607281 master-0 kubenswrapper[15202]: I0319 09:44:02.607255 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" event={"ID":"293d8dea-835a-4a95-8f41-6e6ea369b85a","Type":"ContainerStarted","Data":"132cbe964354ee1a3e17c7e48d2370c47682d34e9417dc82375590bbaf491ad1"}
Mar 19 09:44:02.624367 master-0 kubenswrapper[15202]: E0319 09:44:02.624305 15202 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api_127c2b98-5be4-46f3-95d6-1901fab637ff_1 in pod sandbox b7e341ab969fcd8bd133e7848344f707ee9509e3b189964cb45d6044bc9e317d from index: no such id: 'dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99'" containerID="dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99"
Mar 19 09:44:02.624495 master-0 kubenswrapper[15202]: E0319 09:44:02.624383 15202 kuberuntime_container.go:896] "Unhandled Error" err="failed to remove pod init container \"machine-os-images\": rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api_127c2b98-5be4-46f3-95d6-1901fab637ff_1 in pod sandbox b7e341ab969fcd8bd133e7848344f707ee9509e3b189964cb45d6044bc9e317d from index: no such id: 'dd93ddcdfe863507be2c0c3e9bec5f2c1ff84086354b673df3a4ae01a224ac99'; Skipping pod \"metal3-image-customization-7b5d8dfcfd-gjzrj_openshift-machine-api(127c2b98-5be4-46f3-95d6-1901fab637ff)\"" logger="UnhandledError"
Mar 19 09:44:04.630570 master-0 kubenswrapper[15202]: I0319 09:44:04.630515 15202 generic.go:334] "Generic (PLEG): container finished" podID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerID="4545a02142d1bf8a430f5cce975cf507a4b27cb65d67083454d3d223399544ee" exitCode=0
Mar 19 09:44:04.630570 master-0 kubenswrapper[15202]: I0319 09:44:04.630563 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" event={"ID":"293d8dea-835a-4a95-8f41-6e6ea369b85a","Type":"ContainerDied","Data":"4545a02142d1bf8a430f5cce975cf507a4b27cb65d67083454d3d223399544ee"}
Mar 19 09:44:06.646597 master-0 kubenswrapper[15202]: I0319 09:44:06.646512 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" event={"ID":"127c2b98-5be4-46f3-95d6-1901fab637ff","Type":"ContainerStarted","Data":"0e72662b49b19fffaf16f81c2bb4595091c1a417219ada3e452c074a29b4e089"}
Mar 19 09:44:06.651576 master-0 kubenswrapper[15202]: I0319 09:44:06.651523 15202 generic.go:334] "Generic (PLEG): container finished" podID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerID="f31a39c40dcef9406db5f7e1e788f09f4d57a0d48dae6e9b521f09a4457d4910" exitCode=0
Mar 19 09:44:06.651576 master-0 kubenswrapper[15202]: I0319 09:44:06.651575 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" event={"ID":"293d8dea-835a-4a95-8f41-6e6ea369b85a","Type":"ContainerDied","Data":"f31a39c40dcef9406db5f7e1e788f09f4d57a0d48dae6e9b521f09a4457d4910"}
Mar 19 09:44:06.683631 master-0 kubenswrapper[15202]: I0319 09:44:06.683535 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/metal3-image-customization-7b5d8dfcfd-gjzrj" podStartSLOduration=2.177452442 podStartE2EDuration="55.683517597s" podCreationTimestamp="2026-03-19 09:43:11 +0000 UTC" firstStartedPulling="2026-03-19 09:43:12.242672285 +0000 UTC m=+1109.628087101" lastFinishedPulling="2026-03-19 09:44:05.74873744 +0000 UTC m=+1163.134152256" observedRunningTime="2026-03-19 09:44:06.667925613 +0000 UTC m=+1164.053340429" watchObservedRunningTime="2026-03-19 09:44:06.683517597 +0000 UTC m=+1164.068932413"
Mar 19 09:44:07.925231 master-0 kubenswrapper[15202]: I0319 09:44:07.925091 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4"
Mar 19 09:44:08.077684 master-0 kubenswrapper[15202]: I0319 09:44:08.077600 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-util\") pod \"293d8dea-835a-4a95-8f41-6e6ea369b85a\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") "
Mar 19 09:44:08.078096 master-0 kubenswrapper[15202]: I0319 09:44:08.077738 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-bundle\") pod \"293d8dea-835a-4a95-8f41-6e6ea369b85a\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") "
Mar 19 09:44:08.078096 master-0 kubenswrapper[15202]: I0319 09:44:08.077773 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2nph\" (UniqueName: \"kubernetes.io/projected/293d8dea-835a-4a95-8f41-6e6ea369b85a-kube-api-access-r2nph\") pod \"293d8dea-835a-4a95-8f41-6e6ea369b85a\" (UID: \"293d8dea-835a-4a95-8f41-6e6ea369b85a\") "
Mar 19 09:44:08.078566 master-0 kubenswrapper[15202]: I0319 09:44:08.078433 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-bundle" (OuterVolumeSpecName: "bundle") pod "293d8dea-835a-4a95-8f41-6e6ea369b85a" (UID: "293d8dea-835a-4a95-8f41-6e6ea369b85a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:44:08.080699 master-0 kubenswrapper[15202]: I0319 09:44:08.080638 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293d8dea-835a-4a95-8f41-6e6ea369b85a-kube-api-access-r2nph" (OuterVolumeSpecName: "kube-api-access-r2nph") pod "293d8dea-835a-4a95-8f41-6e6ea369b85a" (UID: "293d8dea-835a-4a95-8f41-6e6ea369b85a"). InnerVolumeSpecName "kube-api-access-r2nph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:44:08.090807 master-0 kubenswrapper[15202]: I0319 09:44:08.090739 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-util" (OuterVolumeSpecName: "util") pod "293d8dea-835a-4a95-8f41-6e6ea369b85a" (UID: "293d8dea-835a-4a95-8f41-6e6ea369b85a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:44:08.179577 master-0 kubenswrapper[15202]: I0319 09:44:08.179510 15202 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-util\") on node \"master-0\" DevicePath \"\""
Mar 19 09:44:08.179577 master-0 kubenswrapper[15202]: I0319 09:44:08.179566 15202 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/293d8dea-835a-4a95-8f41-6e6ea369b85a-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:44:08.179814 master-0 kubenswrapper[15202]: I0319 09:44:08.179586 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2nph\" (UniqueName: \"kubernetes.io/projected/293d8dea-835a-4a95-8f41-6e6ea369b85a-kube-api-access-r2nph\") on node \"master-0\" DevicePath \"\""
Mar 19 09:44:08.691053 master-0 kubenswrapper[15202]: I0319 09:44:08.690977 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4" event={"ID":"293d8dea-835a-4a95-8f41-6e6ea369b85a","Type":"ContainerDied","Data":"132cbe964354ee1a3e17c7e48d2370c47682d34e9417dc82375590bbaf491ad1"}
Mar 19 09:44:08.691053 master-0 kubenswrapper[15202]: I0319 09:44:08.691038 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132cbe964354ee1a3e17c7e48d2370c47682d34e9417dc82375590bbaf491ad1"
Mar 19 09:44:08.691780 master-0 kubenswrapper[15202]: I0319 09:44:08.691071 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4tqlc4"
Mar 19 09:44:13.588543 master-0 kubenswrapper[15202]: I0319 09:44:13.588459 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-c6dbd8b78-6p8rh"]
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: E0319 09:44:13.588775 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="pull"
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: I0319 09:44:13.588787 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="pull"
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: E0319 09:44:13.588809 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="extract"
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: I0319 09:44:13.588815 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="extract"
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: E0319 09:44:13.588835 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="util"
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: I0319 09:44:13.588841 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="util"
Mar 19 09:44:13.589149 master-0 kubenswrapper[15202]: I0319 09:44:13.588978 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="293d8dea-835a-4a95-8f41-6e6ea369b85a" containerName="extract"
Mar 19 09:44:13.589492 master-0 kubenswrapper[15202]: I0319 09:44:13.589461 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.591362 master-0 kubenswrapper[15202]: I0319 09:44:13.591324 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert"
Mar 19 09:44:13.591755 master-0 kubenswrapper[15202]: I0319 09:44:13.591721 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert"
Mar 19 09:44:13.592824 master-0 kubenswrapper[15202]: I0319 09:44:13.592794 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert"
Mar 19 09:44:13.592886 master-0 kubenswrapper[15202]: I0319 09:44:13.592850 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt"
Mar 19 09:44:13.592929 master-0 kubenswrapper[15202]: I0319 09:44:13.592890 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt"
Mar 19 09:44:13.618373 master-0 kubenswrapper[15202]: I0319 09:44:13.616901 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-c6dbd8b78-6p8rh"]
Mar 19 09:44:13.702004 master-0 kubenswrapper[15202]: I0319 09:44:13.701950 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-apiservice-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.702341 master-0 kubenswrapper[15202]: I0319 09:44:13.702319 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-socket-dir\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.702491 master-0 kubenswrapper[15202]: I0319 09:44:13.702451 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp4j5\" (UniqueName: \"kubernetes.io/projected/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-kube-api-access-wp4j5\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.702618 master-0 kubenswrapper[15202]: I0319 09:44:13.702601 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-webhook-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.702811 master-0 kubenswrapper[15202]: I0319 09:44:13.702793 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-metrics-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.804842 master-0 kubenswrapper[15202]: I0319 09:44:13.804717 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-socket-dir\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.804842 master-0 kubenswrapper[15202]: I0319 09:44:13.804826 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp4j5\" (UniqueName: \"kubernetes.io/projected/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-kube-api-access-wp4j5\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.804842 master-0 kubenswrapper[15202]: I0319 09:44:13.804872 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-webhook-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.805305 master-0 kubenswrapper[15202]: I0319 09:44:13.804972 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-metrics-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.805305 master-0 kubenswrapper[15202]: I0319 09:44:13.805064 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-apiservice-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.805305 master-0 kubenswrapper[15202]: I0319 09:44:13.805241 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-socket-dir\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.808443 master-0 kubenswrapper[15202]: I0319 09:44:13.808388 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-webhook-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.810239 master-0 kubenswrapper[15202]: I0319 09:44:13.810195 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-apiservice-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.821864 master-0 kubenswrapper[15202]: I0319 09:44:13.821811 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-metrics-cert\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.847490 master-0 kubenswrapper[15202]: I0319 09:44:13.847347 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp4j5\" (UniqueName: \"kubernetes.io/projected/5cb2f66e-1a46-44ee-b8d4-d42b323bef33-kube-api-access-wp4j5\") pod \"lvms-operator-c6dbd8b78-6p8rh\" (UID: \"5cb2f66e-1a46-44ee-b8d4-d42b323bef33\") " pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:13.928312 master-0 kubenswrapper[15202]: I0319 09:44:13.928250 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:14.330516 master-0 kubenswrapper[15202]: I0319 09:44:14.330449 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-c6dbd8b78-6p8rh"]
Mar 19 09:44:14.336646 master-0 kubenswrapper[15202]: W0319 09:44:14.335871 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb2f66e_1a46_44ee_b8d4_d42b323bef33.slice/crio-59dbca996a856c347cb66285a1ae627525f5de637fb7e75c5624492ea878edd3 WatchSource:0}: Error finding container 59dbca996a856c347cb66285a1ae627525f5de637fb7e75c5624492ea878edd3: Status 404 returned error can't find the container with id 59dbca996a856c347cb66285a1ae627525f5de637fb7e75c5624492ea878edd3
Mar 19 09:44:14.339985 master-0 kubenswrapper[15202]: I0319 09:44:14.339943 15202 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:44:14.740185 master-0 kubenswrapper[15202]: I0319 09:44:14.740087 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh" event={"ID":"5cb2f66e-1a46-44ee-b8d4-d42b323bef33","Type":"ContainerStarted","Data":"59dbca996a856c347cb66285a1ae627525f5de637fb7e75c5624492ea878edd3"}
Mar 19 09:44:19.788988 master-0 kubenswrapper[15202]: I0319 09:44:19.788912 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh" event={"ID":"5cb2f66e-1a46-44ee-b8d4-d42b323bef33","Type":"ContainerStarted","Data":"0864cab700f763453d9ac4f47428b9ce9707ebabcec923ba5ef71914d5cb754c"}
Mar 19 09:44:19.789723 master-0 kubenswrapper[15202]: I0319 09:44:19.789680 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:19.818434 master-0 kubenswrapper[15202]: I0319 09:44:19.818310 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh" podStartSLOduration=1.803916199 podStartE2EDuration="6.81828649s" podCreationTimestamp="2026-03-19 09:44:13 +0000 UTC" firstStartedPulling="2026-03-19 09:44:14.33978571 +0000 UTC m=+1171.725200526" lastFinishedPulling="2026-03-19 09:44:19.354156011 +0000 UTC m=+1176.739570817" observedRunningTime="2026-03-19 09:44:19.806501559 +0000 UTC m=+1177.191916415" watchObservedRunningTime="2026-03-19 09:44:19.81828649 +0000 UTC m=+1177.203701346"
Mar 19 09:44:20.801058 master-0 kubenswrapper[15202]: I0319 09:44:20.800991 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-c6dbd8b78-6p8rh"
Mar 19 09:44:24.889326 master-0 kubenswrapper[15202]: I0319 09:44:24.889188 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"]
Mar 19 09:44:24.894307 master-0 kubenswrapper[15202]: I0319 09:44:24.891551 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:24.897815 master-0 kubenswrapper[15202]: I0319 09:44:24.895599 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-2l79b"
Mar 19 09:44:24.921452 master-0 kubenswrapper[15202]: I0319 09:44:24.921396 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:24.921818 master-0 kubenswrapper[15202]: I0319 09:44:24.921757 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:24.921929 master-0 kubenswrapper[15202]: I0319 09:44:24.921887 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbm9z\" (UniqueName: \"kubernetes.io/projected/835ad7d0-d887-479f-a987-f63d182abd0f-kube-api-access-rbm9z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:24.946217 master-0 kubenswrapper[15202]: I0319 09:44:24.946110 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"]
Mar 19 09:44:25.025103 master-0 kubenswrapper[15202]: I0319 09:44:25.025033 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.025103 master-0 kubenswrapper[15202]: I0319 09:44:25.025106 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbm9z\" (UniqueName: \"kubernetes.io/projected/835ad7d0-d887-479f-a987-f63d182abd0f-kube-api-access-rbm9z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.025386 master-0 kubenswrapper[15202]: I0319 09:44:25.025138 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.025658 master-0 kubenswrapper[15202]: I0319 09:44:25.025627 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.026275 master-0 kubenswrapper[15202]: I0319 09:44:25.026244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.162330 master-0 kubenswrapper[15202]: I0319 09:44:25.162209 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbm9z\" (UniqueName: \"kubernetes.io/projected/835ad7d0-d887-479f-a987-f63d182abd0f-kube-api-access-rbm9z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.179922 master-0 kubenswrapper[15202]: I0319 09:44:25.179841 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"]
Mar 19 09:44:25.181589 master-0 kubenswrapper[15202]: I0319 09:44:25.181553 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.199761 master-0 kubenswrapper[15202]: I0319 09:44:25.199675 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"]
Mar 19 09:44:25.235678 master-0 kubenswrapper[15202]: I0319 09:44:25.235608 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"
Mar 19 09:44:25.328957 master-0 kubenswrapper[15202]: I0319 09:44:25.328862 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4nf\" (UniqueName: \"kubernetes.io/projected/6df45cef-c6b2-452d-a0e5-3b635c776815-kube-api-access-7b4nf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.329098 master-0 kubenswrapper[15202]: I0319 09:44:25.328968 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.329098 master-0 kubenswrapper[15202]: I0319 09:44:25.328997 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.431120 master-0 kubenswrapper[15202]: I0319 09:44:25.431029 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4nf\" (UniqueName: \"kubernetes.io/projected/6df45cef-c6b2-452d-a0e5-3b635c776815-kube-api-access-7b4nf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.431349 master-0 kubenswrapper[15202]: I0319 09:44:25.431147 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.431349 master-0 kubenswrapper[15202]: I0319 09:44:25.431188 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.432306 master-0 kubenswrapper[15202]: I0319 09:44:25.431939 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.432407 master-0 kubenswrapper[15202]: I0319 09:44:25.432323 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.452177 master-0 kubenswrapper[15202]: I0319 09:44:25.452102 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4nf\" (UniqueName: \"kubernetes.io/projected/6df45cef-c6b2-452d-a0e5-3b635c776815-kube-api-access-7b4nf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.506422 master-0 kubenswrapper[15202]: I0319 09:44:25.506330 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"
Mar 19 09:44:25.765774 master-0 kubenswrapper[15202]: I0319 09:44:25.765631 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8"]
Mar 19 09:44:25.784124 master-0 kubenswrapper[15202]: W0319 09:44:25.781943 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod835ad7d0_d887_479f_a987_f63d182abd0f.slice/crio-0feee884758463e2506f57543043a33348cf0eeb7db5e05607d3670f3f1ef8dc WatchSource:0}: Error finding container 0feee884758463e2506f57543043a33348cf0eeb7db5e05607d3670f3f1ef8dc: Status 404 returned error can't find the container with id 0feee884758463e2506f57543043a33348cf0eeb7db5e05607d3670f3f1ef8dc
Mar 19 09:44:25.842595 master-0 kubenswrapper[15202]: I0319 09:44:25.842511 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" event={"ID":"835ad7d0-d887-479f-a987-f63d182abd0f","Type":"ContainerStarted","Data":"0feee884758463e2506f57543043a33348cf0eeb7db5e05607d3670f3f1ef8dc"}
Mar 19 09:44:26.140495 master-0 kubenswrapper[15202]: I0319 09:44:26.140420 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86"]
Mar 19 09:44:26.826247 master-0 kubenswrapper[15202]: I0319 09:44:26.826184 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"]
Mar 19 09:44:26.828439 master-0 kubenswrapper[15202]: I0319 09:44:26.828409 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:26.837299 master-0 kubenswrapper[15202]: I0319 09:44:26.837229 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"]
Mar 19 09:44:26.859567 master-0 kubenswrapper[15202]: I0319 09:44:26.859101 15202 generic.go:334] "Generic (PLEG): container finished" podID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerID="f90d50a836ba40850fc92f37088947b012341f4e0a85beb6aad8545c8f0268f7" exitCode=0
Mar 19 09:44:26.859567 master-0 kubenswrapper[15202]: I0319 09:44:26.859146 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" event={"ID":"6df45cef-c6b2-452d-a0e5-3b635c776815","Type":"ContainerDied","Data":"f90d50a836ba40850fc92f37088947b012341f4e0a85beb6aad8545c8f0268f7"}
Mar 19 09:44:26.859567 master-0 kubenswrapper[15202]: I0319 09:44:26.859200 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" event={"ID":"6df45cef-c6b2-452d-a0e5-3b635c776815","Type":"ContainerStarted","Data":"40d311b7ea6096d11e4f72e62c4f6d63d4a64e3942a715fe748673a9ed5deec5"}
Mar 19 09:44:26.862325 master-0 kubenswrapper[15202]: I0319 09:44:26.862288 15202 generic.go:334] "Generic (PLEG): container finished" podID="835ad7d0-d887-479f-a987-f63d182abd0f" containerID="5927ad492fe01007954b1e16ac309c8f1e6aa24575275e47dc13299904942fd4" exitCode=0
Mar 19 09:44:26.862410 master-0 kubenswrapper[15202]: I0319 09:44:26.862346 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" event={"ID":"835ad7d0-d887-479f-a987-f63d182abd0f","Type":"ContainerDied","Data":"5927ad492fe01007954b1e16ac309c8f1e6aa24575275e47dc13299904942fd4"}
Mar 19 09:44:26.961043 master-0 kubenswrapper[15202]: I0319 09:44:26.960955 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzlk\" (UniqueName: \"kubernetes.io/projected/b0cf5e26-20c2-4793-bf18-53909bb0fce9-kube-api-access-ctzlk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:26.961043 master-0 kubenswrapper[15202]: I0319 09:44:26.961044 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:26.962239 master-0 kubenswrapper[15202]: I0319 09:44:26.962182 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.063728 master-0 kubenswrapper[15202]: I0319 09:44:27.063628 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.063938 master-0 kubenswrapper[15202]: I0319 09:44:27.063859 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.063938 master-0 kubenswrapper[15202]: I0319 09:44:27.063928 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzlk\" (UniqueName: \"kubernetes.io/projected/b0cf5e26-20c2-4793-bf18-53909bb0fce9-kube-api-access-ctzlk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.064204 master-0 kubenswrapper[15202]: I0319 09:44:27.064146 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.064630 master-0 kubenswrapper[15202]: I0319 09:44:27.064609 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.084203 master-0 kubenswrapper[15202]: I0319 09:44:27.084053 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzlk\" (UniqueName: \"kubernetes.io/projected/b0cf5e26-20c2-4793-bf18-53909bb0fce9-kube-api-access-ctzlk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.157853 master-0 kubenswrapper[15202]: I0319 09:44:27.157778 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"
Mar 19 09:44:27.610669 master-0 kubenswrapper[15202]: I0319 09:44:27.610412 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr"]
Mar 19 09:44:27.620106 master-0 kubenswrapper[15202]: W0319 09:44:27.620032 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0cf5e26_20c2_4793_bf18_53909bb0fce9.slice/crio-c3dcee530eddf370c591657ecb968977490553ea0a34de9e1313e014d1d4ba48 WatchSource:0}: Error finding container c3dcee530eddf370c591657ecb968977490553ea0a34de9e1313e014d1d4ba48: Status 404 returned error can't find the container with id c3dcee530eddf370c591657ecb968977490553ea0a34de9e1313e014d1d4ba48
Mar 19 09:44:27.875770 master-0 kubenswrapper[15202]: I0319 09:44:27.875177 15202 generic.go:334] "Generic (PLEG): container finished" 
podID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerID="4d04bb0d7c26e3ab746dcbcfd2f55c80ad60957240d3fbeeb631b0b1f50aba62" exitCode=0 Mar 19 09:44:27.875770 master-0 kubenswrapper[15202]: I0319 09:44:27.875239 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" event={"ID":"b0cf5e26-20c2-4793-bf18-53909bb0fce9","Type":"ContainerDied","Data":"4d04bb0d7c26e3ab746dcbcfd2f55c80ad60957240d3fbeeb631b0b1f50aba62"} Mar 19 09:44:27.875770 master-0 kubenswrapper[15202]: I0319 09:44:27.875288 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" event={"ID":"b0cf5e26-20c2-4793-bf18-53909bb0fce9","Type":"ContainerStarted","Data":"c3dcee530eddf370c591657ecb968977490553ea0a34de9e1313e014d1d4ba48"} Mar 19 09:44:28.887643 master-0 kubenswrapper[15202]: I0319 09:44:28.887581 15202 generic.go:334] "Generic (PLEG): container finished" podID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerID="7bc579ddcea792ba2372899e96b81166f9d1aa6f43a2b5d7600a9ee50f3f6290" exitCode=0 Mar 19 09:44:28.888197 master-0 kubenswrapper[15202]: I0319 09:44:28.887626 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" event={"ID":"6df45cef-c6b2-452d-a0e5-3b635c776815","Type":"ContainerDied","Data":"7bc579ddcea792ba2372899e96b81166f9d1aa6f43a2b5d7600a9ee50f3f6290"} Mar 19 09:44:31.915865 master-0 kubenswrapper[15202]: I0319 09:44:31.915428 15202 generic.go:334] "Generic (PLEG): container finished" podID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerID="93e5597e69bd1173df01d9adda20fec578f4cbbab72ac1c3ca002e03d9708742" exitCode=0 Mar 19 09:44:31.915865 master-0 kubenswrapper[15202]: I0319 09:44:31.915545 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" event={"ID":"6df45cef-c6b2-452d-a0e5-3b635c776815","Type":"ContainerDied","Data":"93e5597e69bd1173df01d9adda20fec578f4cbbab72ac1c3ca002e03d9708742"} Mar 19 09:44:31.917996 master-0 kubenswrapper[15202]: I0319 09:44:31.917957 15202 generic.go:334] "Generic (PLEG): container finished" podID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerID="de557df31dcaba493aedc548f1449130394ae27bde492d5434fbc8de156b9983" exitCode=0 Mar 19 09:44:31.918133 master-0 kubenswrapper[15202]: I0319 09:44:31.918075 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" event={"ID":"b0cf5e26-20c2-4793-bf18-53909bb0fce9","Type":"ContainerDied","Data":"de557df31dcaba493aedc548f1449130394ae27bde492d5434fbc8de156b9983"} Mar 19 09:44:31.922135 master-0 kubenswrapper[15202]: I0319 09:44:31.922018 15202 generic.go:334] "Generic (PLEG): container finished" podID="835ad7d0-d887-479f-a987-f63d182abd0f" containerID="47784ecf80f0b17c7dcd4dccfa385cc8b71ebbf9dec42b72093c67b100e5d79e" exitCode=0 Mar 19 09:44:31.922135 master-0 kubenswrapper[15202]: I0319 09:44:31.922102 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" event={"ID":"835ad7d0-d887-479f-a987-f63d182abd0f","Type":"ContainerDied","Data":"47784ecf80f0b17c7dcd4dccfa385cc8b71ebbf9dec42b72093c67b100e5d79e"} Mar 19 09:44:32.409657 master-0 kubenswrapper[15202]: I0319 09:44:32.409584 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf"] Mar 19 09:44:32.411905 master-0 kubenswrapper[15202]: I0319 09:44:32.411358 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.431738 master-0 kubenswrapper[15202]: I0319 09:44:32.431699 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf"] Mar 19 09:44:32.563663 master-0 kubenswrapper[15202]: I0319 09:44:32.563492 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2968x\" (UniqueName: \"kubernetes.io/projected/2bb4ae49-2df5-4944-8ddf-0da713459352-kube-api-access-2968x\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.563663 master-0 kubenswrapper[15202]: I0319 09:44:32.563573 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.563663 master-0 kubenswrapper[15202]: I0319 09:44:32.563601 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.665007 master-0 kubenswrapper[15202]: I0319 09:44:32.664951 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2968x\" (UniqueName: \"kubernetes.io/projected/2bb4ae49-2df5-4944-8ddf-0da713459352-kube-api-access-2968x\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.665288 master-0 kubenswrapper[15202]: I0319 09:44:32.665269 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.665423 master-0 kubenswrapper[15202]: I0319 09:44:32.665401 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.665883 master-0 kubenswrapper[15202]: I0319 09:44:32.665857 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.666066 master-0 kubenswrapper[15202]: I0319 09:44:32.665997 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-util\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.683537 master-0 kubenswrapper[15202]: I0319 09:44:32.683491 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2968x\" (UniqueName: \"kubernetes.io/projected/2bb4ae49-2df5-4944-8ddf-0da713459352-kube-api-access-2968x\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.732880 master-0 kubenswrapper[15202]: I0319 09:44:32.732803 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" Mar 19 09:44:32.932501 master-0 kubenswrapper[15202]: I0319 09:44:32.932352 15202 generic.go:334] "Generic (PLEG): container finished" podID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerID="a67cbdb3e49c06201158099d7fb8b3a76059059dde027f5feb4ad08bfc4cd32d" exitCode=0 Mar 19 09:44:32.932501 master-0 kubenswrapper[15202]: I0319 09:44:32.932410 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" event={"ID":"b0cf5e26-20c2-4793-bf18-53909bb0fce9","Type":"ContainerDied","Data":"a67cbdb3e49c06201158099d7fb8b3a76059059dde027f5feb4ad08bfc4cd32d"} Mar 19 09:44:32.934403 master-0 kubenswrapper[15202]: I0319 09:44:32.934348 15202 generic.go:334] "Generic (PLEG): container finished" podID="835ad7d0-d887-479f-a987-f63d182abd0f" containerID="6e24c8c062c59dcdb2e3ba014d60ad955954af5593d0088958b61f9325ea3dc0" exitCode=0 Mar 19 09:44:32.934519 master-0 kubenswrapper[15202]: I0319 09:44:32.934388 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" event={"ID":"835ad7d0-d887-479f-a987-f63d182abd0f","Type":"ContainerDied","Data":"6e24c8c062c59dcdb2e3ba014d60ad955954af5593d0088958b61f9325ea3dc0"} Mar 19 09:44:33.198356 master-0 kubenswrapper[15202]: I0319 09:44:33.198273 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf"] Mar 19 09:44:33.199771 master-0 kubenswrapper[15202]: W0319 09:44:33.199721 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb4ae49_2df5_4944_8ddf_0da713459352.slice/crio-1327a4c75451f969b824099d4366757a2a79b9d4a2cc0a3f8eaca7dee1c8099a WatchSource:0}: Error finding container 1327a4c75451f969b824099d4366757a2a79b9d4a2cc0a3f8eaca7dee1c8099a: Status 404 returned error can't find the container with id 1327a4c75451f969b824099d4366757a2a79b9d4a2cc0a3f8eaca7dee1c8099a Mar 19 09:44:33.235558 master-0 kubenswrapper[15202]: I0319 09:44:33.232593 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" Mar 19 09:44:33.378080 master-0 kubenswrapper[15202]: I0319 09:44:33.378020 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-bundle\") pod \"6df45cef-c6b2-452d-a0e5-3b635c776815\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " Mar 19 09:44:33.378431 master-0 kubenswrapper[15202]: I0319 09:44:33.378229 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4nf\" (UniqueName: \"kubernetes.io/projected/6df45cef-c6b2-452d-a0e5-3b635c776815-kube-api-access-7b4nf\") pod \"6df45cef-c6b2-452d-a0e5-3b635c776815\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " Mar 19 09:44:33.378431 master-0 kubenswrapper[15202]: I0319 09:44:33.378346 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-util\") pod \"6df45cef-c6b2-452d-a0e5-3b635c776815\" (UID: \"6df45cef-c6b2-452d-a0e5-3b635c776815\") " Mar 19 09:44:33.379835 master-0 kubenswrapper[15202]: I0319 09:44:33.379784 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-bundle" (OuterVolumeSpecName: "bundle") pod "6df45cef-c6b2-452d-a0e5-3b635c776815" (UID: "6df45cef-c6b2-452d-a0e5-3b635c776815"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:44:33.382039 master-0 kubenswrapper[15202]: I0319 09:44:33.381780 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df45cef-c6b2-452d-a0e5-3b635c776815-kube-api-access-7b4nf" (OuterVolumeSpecName: "kube-api-access-7b4nf") pod "6df45cef-c6b2-452d-a0e5-3b635c776815" (UID: "6df45cef-c6b2-452d-a0e5-3b635c776815"). InnerVolumeSpecName "kube-api-access-7b4nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:44:33.394403 master-0 kubenswrapper[15202]: I0319 09:44:33.393608 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-util" (OuterVolumeSpecName: "util") pod "6df45cef-c6b2-452d-a0e5-3b635c776815" (UID: "6df45cef-c6b2-452d-a0e5-3b635c776815"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:44:33.480769 master-0 kubenswrapper[15202]: I0319 09:44:33.480721 15202 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:33.480769 master-0 kubenswrapper[15202]: I0319 09:44:33.480760 15202 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6df45cef-c6b2-452d-a0e5-3b635c776815-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:33.480769 master-0 kubenswrapper[15202]: I0319 09:44:33.480773 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4nf\" (UniqueName: \"kubernetes.io/projected/6df45cef-c6b2-452d-a0e5-3b635c776815-kube-api-access-7b4nf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:33.947089 master-0 kubenswrapper[15202]: I0319 09:44:33.946984 15202 generic.go:334] "Generic (PLEG): container finished" podID="2bb4ae49-2df5-4944-8ddf-0da713459352" 
containerID="fdad3e9659770722cc9fc90f1b88f109e481f87c69dd9bc7b501582978a65b2a" exitCode=0 Mar 19 09:44:33.947089 master-0 kubenswrapper[15202]: I0319 09:44:33.947032 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" event={"ID":"2bb4ae49-2df5-4944-8ddf-0da713459352","Type":"ContainerDied","Data":"fdad3e9659770722cc9fc90f1b88f109e481f87c69dd9bc7b501582978a65b2a"} Mar 19 09:44:33.948184 master-0 kubenswrapper[15202]: I0319 09:44:33.947113 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" event={"ID":"2bb4ae49-2df5-4944-8ddf-0da713459352","Type":"ContainerStarted","Data":"1327a4c75451f969b824099d4366757a2a79b9d4a2cc0a3f8eaca7dee1c8099a"} Mar 19 09:44:33.952150 master-0 kubenswrapper[15202]: I0319 09:44:33.951666 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" event={"ID":"6df45cef-c6b2-452d-a0e5-3b635c776815","Type":"ContainerDied","Data":"40d311b7ea6096d11e4f72e62c4f6d63d4a64e3942a715fe748673a9ed5deec5"} Mar 19 09:44:33.952150 master-0 kubenswrapper[15202]: I0319 09:44:33.951726 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d311b7ea6096d11e4f72e62c4f6d63d4a64e3942a715fe748673a9ed5deec5" Mar 19 09:44:33.952150 master-0 kubenswrapper[15202]: I0319 09:44:33.951892 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c159x86" Mar 19 09:44:34.378987 master-0 kubenswrapper[15202]: I0319 09:44:34.378916 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" Mar 19 09:44:34.438352 master-0 kubenswrapper[15202]: I0319 09:44:34.438301 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" Mar 19 09:44:34.503231 master-0 kubenswrapper[15202]: I0319 09:44:34.503171 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctzlk\" (UniqueName: \"kubernetes.io/projected/b0cf5e26-20c2-4793-bf18-53909bb0fce9-kube-api-access-ctzlk\") pod \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " Mar 19 09:44:34.503231 master-0 kubenswrapper[15202]: I0319 09:44:34.503228 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-util\") pod \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " Mar 19 09:44:34.503511 master-0 kubenswrapper[15202]: I0319 09:44:34.503344 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-bundle\") pod \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\" (UID: \"b0cf5e26-20c2-4793-bf18-53909bb0fce9\") " Mar 19 09:44:34.504112 master-0 kubenswrapper[15202]: I0319 09:44:34.504066 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-bundle" (OuterVolumeSpecName: "bundle") pod "b0cf5e26-20c2-4793-bf18-53909bb0fce9" (UID: "b0cf5e26-20c2-4793-bf18-53909bb0fce9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:44:34.507692 master-0 kubenswrapper[15202]: I0319 09:44:34.507670 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cf5e26-20c2-4793-bf18-53909bb0fce9-kube-api-access-ctzlk" (OuterVolumeSpecName: "kube-api-access-ctzlk") pod "b0cf5e26-20c2-4793-bf18-53909bb0fce9" (UID: "b0cf5e26-20c2-4793-bf18-53909bb0fce9"). InnerVolumeSpecName "kube-api-access-ctzlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:44:34.514816 master-0 kubenswrapper[15202]: I0319 09:44:34.513654 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-util" (OuterVolumeSpecName: "util") pod "b0cf5e26-20c2-4793-bf18-53909bb0fce9" (UID: "b0cf5e26-20c2-4793-bf18-53909bb0fce9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:44:34.604345 master-0 kubenswrapper[15202]: I0319 09:44:34.604229 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-bundle\") pod \"835ad7d0-d887-479f-a987-f63d182abd0f\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " Mar 19 09:44:34.604815 master-0 kubenswrapper[15202]: I0319 09:44:34.604795 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-util\") pod \"835ad7d0-d887-479f-a987-f63d182abd0f\" (UID: \"835ad7d0-d887-479f-a987-f63d182abd0f\") " Mar 19 09:44:34.604970 master-0 kubenswrapper[15202]: I0319 09:44:34.604953 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbm9z\" (UniqueName: \"kubernetes.io/projected/835ad7d0-d887-479f-a987-f63d182abd0f-kube-api-access-rbm9z\") pod \"835ad7d0-d887-479f-a987-f63d182abd0f\" (UID: 
\"835ad7d0-d887-479f-a987-f63d182abd0f\") " Mar 19 09:44:34.605370 master-0 kubenswrapper[15202]: I0319 09:44:34.605352 15202 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:34.605460 master-0 kubenswrapper[15202]: I0319 09:44:34.605428 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctzlk\" (UniqueName: \"kubernetes.io/projected/b0cf5e26-20c2-4793-bf18-53909bb0fce9-kube-api-access-ctzlk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:34.605611 master-0 kubenswrapper[15202]: I0319 09:44:34.605597 15202 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0cf5e26-20c2-4793-bf18-53909bb0fce9-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:34.606157 master-0 kubenswrapper[15202]: I0319 09:44:34.606049 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-bundle" (OuterVolumeSpecName: "bundle") pod "835ad7d0-d887-479f-a987-f63d182abd0f" (UID: "835ad7d0-d887-479f-a987-f63d182abd0f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:44:34.608505 master-0 kubenswrapper[15202]: I0319 09:44:34.608384 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835ad7d0-d887-479f-a987-f63d182abd0f-kube-api-access-rbm9z" (OuterVolumeSpecName: "kube-api-access-rbm9z") pod "835ad7d0-d887-479f-a987-f63d182abd0f" (UID: "835ad7d0-d887-479f-a987-f63d182abd0f"). InnerVolumeSpecName "kube-api-access-rbm9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:44:34.620568 master-0 kubenswrapper[15202]: I0319 09:44:34.620420 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-util" (OuterVolumeSpecName: "util") pod "835ad7d0-d887-479f-a987-f63d182abd0f" (UID: "835ad7d0-d887-479f-a987-f63d182abd0f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:44:34.706973 master-0 kubenswrapper[15202]: I0319 09:44:34.706888 15202 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-util\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:34.706973 master-0 kubenswrapper[15202]: I0319 09:44:34.706945 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbm9z\" (UniqueName: \"kubernetes.io/projected/835ad7d0-d887-479f-a987-f63d182abd0f-kube-api-access-rbm9z\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:34.706973 master-0 kubenswrapper[15202]: I0319 09:44:34.706956 15202 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/835ad7d0-d887-479f-a987-f63d182abd0f-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:44:34.963812 master-0 kubenswrapper[15202]: I0319 09:44:34.963722 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" event={"ID":"835ad7d0-d887-479f-a987-f63d182abd0f","Type":"ContainerDied","Data":"0feee884758463e2506f57543043a33348cf0eeb7db5e05607d3670f3f1ef8dc"} Mar 19 09:44:34.963812 master-0 kubenswrapper[15202]: I0319 09:44:34.963764 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0feee884758463e2506f57543043a33348cf0eeb7db5e05607d3670f3f1ef8dc" Mar 19 09:44:34.965031 master-0 kubenswrapper[15202]: I0319 
09:44:34.964615 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5n72d8" Mar 19 09:44:34.968642 master-0 kubenswrapper[15202]: I0319 09:44:34.968588 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" event={"ID":"b0cf5e26-20c2-4793-bf18-53909bb0fce9","Type":"ContainerDied","Data":"c3dcee530eddf370c591657ecb968977490553ea0a34de9e1313e014d1d4ba48"} Mar 19 09:44:34.969239 master-0 kubenswrapper[15202]: I0319 09:44:34.968767 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3dcee530eddf370c591657ecb968977490553ea0a34de9e1313e014d1d4ba48" Mar 19 09:44:34.969239 master-0 kubenswrapper[15202]: I0319 09:44:34.968656 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874c2wdr" Mar 19 09:44:35.980032 master-0 kubenswrapper[15202]: I0319 09:44:35.979947 15202 generic.go:334] "Generic (PLEG): container finished" podID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerID="3ce98e67f998250ad11faa532ef0b2ed95935afaa59827039ba7f8d27cdb7859" exitCode=0 Mar 19 09:44:35.980032 master-0 kubenswrapper[15202]: I0319 09:44:35.980018 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" event={"ID":"2bb4ae49-2df5-4944-8ddf-0da713459352","Type":"ContainerDied","Data":"3ce98e67f998250ad11faa532ef0b2ed95935afaa59827039ba7f8d27cdb7859"} Mar 19 09:44:36.992614 master-0 kubenswrapper[15202]: I0319 09:44:36.992351 15202 generic.go:334] "Generic (PLEG): container finished" podID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerID="507db04fae68e0f15f76ef4d7e5367d9a0d8246bebd4b6687dd6453a370124f0" exitCode=0 Mar 19 09:44:36.992614 master-0 
kubenswrapper[15202]: I0319 09:44:36.992418 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" event={"ID":"2bb4ae49-2df5-4944-8ddf-0da713459352","Type":"ContainerDied","Data":"507db04fae68e0f15f76ef4d7e5367d9a0d8246bebd4b6687dd6453a370124f0"}
Mar 19 09:44:38.310985 master-0 kubenswrapper[15202]: I0319 09:44:38.310924 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf"
Mar 19 09:44:38.465902 master-0 kubenswrapper[15202]: I0319 09:44:38.465822 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-util\") pod \"2bb4ae49-2df5-4944-8ddf-0da713459352\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") "
Mar 19 09:44:38.466143 master-0 kubenswrapper[15202]: I0319 09:44:38.465951 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2968x\" (UniqueName: \"kubernetes.io/projected/2bb4ae49-2df5-4944-8ddf-0da713459352-kube-api-access-2968x\") pod \"2bb4ae49-2df5-4944-8ddf-0da713459352\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") "
Mar 19 09:44:38.466860 master-0 kubenswrapper[15202]: I0319 09:44:38.466807 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-bundle\") pod \"2bb4ae49-2df5-4944-8ddf-0da713459352\" (UID: \"2bb4ae49-2df5-4944-8ddf-0da713459352\") "
Mar 19 09:44:38.468563 master-0 kubenswrapper[15202]: I0319 09:44:38.468527 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb4ae49-2df5-4944-8ddf-0da713459352-kube-api-access-2968x" (OuterVolumeSpecName: "kube-api-access-2968x") pod "2bb4ae49-2df5-4944-8ddf-0da713459352" (UID: "2bb4ae49-2df5-4944-8ddf-0da713459352"). InnerVolumeSpecName "kube-api-access-2968x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:44:38.470308 master-0 kubenswrapper[15202]: I0319 09:44:38.470267 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-bundle" (OuterVolumeSpecName: "bundle") pod "2bb4ae49-2df5-4944-8ddf-0da713459352" (UID: "2bb4ae49-2df5-4944-8ddf-0da713459352"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:44:38.481160 master-0 kubenswrapper[15202]: I0319 09:44:38.481086 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-util" (OuterVolumeSpecName: "util") pod "2bb4ae49-2df5-4944-8ddf-0da713459352" (UID: "2bb4ae49-2df5-4944-8ddf-0da713459352"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:44:38.569690 master-0 kubenswrapper[15202]: I0319 09:44:38.569544 15202 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-util\") on node \"master-0\" DevicePath \"\""
Mar 19 09:44:38.569690 master-0 kubenswrapper[15202]: I0319 09:44:38.569598 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2968x\" (UniqueName: \"kubernetes.io/projected/2bb4ae49-2df5-4944-8ddf-0da713459352-kube-api-access-2968x\") on node \"master-0\" DevicePath \"\""
Mar 19 09:44:38.569690 master-0 kubenswrapper[15202]: I0319 09:44:38.569616 15202 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bb4ae49-2df5-4944-8ddf-0da713459352-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:44:39.018456 master-0 kubenswrapper[15202]: I0319 09:44:39.018391 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf" event={"ID":"2bb4ae49-2df5-4944-8ddf-0da713459352","Type":"ContainerDied","Data":"1327a4c75451f969b824099d4366757a2a79b9d4a2cc0a3f8eaca7dee1c8099a"}
Mar 19 09:44:39.018688 master-0 kubenswrapper[15202]: I0319 09:44:39.018500 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1327a4c75451f969b824099d4366757a2a79b9d4a2cc0a3f8eaca7dee1c8099a"
Mar 19 09:44:39.018688 master-0 kubenswrapper[15202]: I0319 09:44:39.018432 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726thnjf"
Mar 19 09:44:46.133145 master-0 kubenswrapper[15202]: I0319 09:44:46.133102 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"]
Mar 19 09:44:46.135599 master-0 kubenswrapper[15202]: E0319 09:44:46.135578 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="util"
Mar 19 09:44:46.151602 master-0 kubenswrapper[15202]: I0319 09:44:46.151544 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="util"
Mar 19 09:44:46.151877 master-0 kubenswrapper[15202]: E0319 09:44:46.151861 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="extract"
Mar 19 09:44:46.151955 master-0 kubenswrapper[15202]: I0319 09:44:46.151942 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="extract"
Mar 19 09:44:46.152061 master-0 kubenswrapper[15202]: E0319 09:44:46.152050 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="extract"
Mar 19 09:44:46.152122 master-0 kubenswrapper[15202]: I0319 09:44:46.152111 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="extract"
Mar 19 09:44:46.152197 master-0 kubenswrapper[15202]: E0319 09:44:46.152187 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="pull"
Mar 19 09:44:46.152253 master-0 kubenswrapper[15202]: I0319 09:44:46.152244 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="pull"
Mar 19 09:44:46.152313 master-0 kubenswrapper[15202]: E0319 09:44:46.152304 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="util"
Mar 19 09:44:46.152369 master-0 kubenswrapper[15202]: I0319 09:44:46.152360 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="util"
Mar 19 09:44:46.152437 master-0 kubenswrapper[15202]: E0319 09:44:46.152427 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="extract"
Mar 19 09:44:46.152513 master-0 kubenswrapper[15202]: I0319 09:44:46.152503 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="extract"
Mar 19 09:44:46.152605 master-0 kubenswrapper[15202]: E0319 09:44:46.152593 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="pull"
Mar 19 09:44:46.152700 master-0 kubenswrapper[15202]: I0319 09:44:46.152684 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="pull"
Mar 19 09:44:46.152777 master-0 kubenswrapper[15202]: E0319 09:44:46.152766 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="util"
Mar 19 09:44:46.152836 master-0 kubenswrapper[15202]: I0319 09:44:46.152826 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="util"
Mar 19 09:44:46.152930 master-0 kubenswrapper[15202]: E0319 09:44:46.152919 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="pull"
Mar 19 09:44:46.152990 master-0 kubenswrapper[15202]: I0319 09:44:46.152981 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="pull"
Mar 19 09:44:46.153054 master-0 kubenswrapper[15202]: E0319 09:44:46.153044 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="extract"
Mar 19 09:44:46.153115 master-0 kubenswrapper[15202]: I0319 09:44:46.153104 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="extract"
Mar 19 09:44:46.153175 master-0 kubenswrapper[15202]: E0319 09:44:46.153165 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="pull"
Mar 19 09:44:46.153233 master-0 kubenswrapper[15202]: I0319 09:44:46.153224 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="pull"
Mar 19 09:44:46.153296 master-0 kubenswrapper[15202]: E0319 09:44:46.153286 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="util"
Mar 19 09:44:46.153352 master-0 kubenswrapper[15202]: I0319 09:44:46.153343 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="util"
Mar 19 09:44:46.153718 master-0 kubenswrapper[15202]: I0319 09:44:46.153699 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="835ad7d0-d887-479f-a987-f63d182abd0f" containerName="extract"
Mar 19 09:44:46.153824 master-0 kubenswrapper[15202]: I0319 09:44:46.153813 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cf5e26-20c2-4793-bf18-53909bb0fce9" containerName="extract"
Mar 19 09:44:46.153903 master-0 kubenswrapper[15202]: I0319 09:44:46.153892 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb4ae49-2df5-4944-8ddf-0da713459352" containerName="extract"
Mar 19 09:44:46.153961 master-0 kubenswrapper[15202]: I0319 09:44:46.153952 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df45cef-c6b2-452d-a0e5-3b635c776815" containerName="extract"
Mar 19 09:44:46.156145 master-0 kubenswrapper[15202]: I0319 09:44:46.156125 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"
Mar 19 09:44:46.161500 master-0 kubenswrapper[15202]: I0319 09:44:46.160344 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"]
Mar 19 09:44:46.162198 master-0 kubenswrapper[15202]: I0319 09:44:46.162176 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 19 09:44:46.162457 master-0 kubenswrapper[15202]: I0319 09:44:46.162443 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 19 09:44:46.203898 master-0 kubenswrapper[15202]: I0319 09:44:46.202485 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwwvl\" (UniqueName: \"kubernetes.io/projected/628c6cfa-b09e-4a74-a152-20da732dd6db-kube-api-access-gwwvl\") pod \"nmstate-operator-796d4cfff4-h6jnz\" (UID: \"628c6cfa-b09e-4a74-a152-20da732dd6db\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"
Mar 19 09:44:46.308654 master-0 kubenswrapper[15202]: I0319 09:44:46.308041 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwwvl\" (UniqueName: \"kubernetes.io/projected/628c6cfa-b09e-4a74-a152-20da732dd6db-kube-api-access-gwwvl\") pod \"nmstate-operator-796d4cfff4-h6jnz\" (UID: \"628c6cfa-b09e-4a74-a152-20da732dd6db\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"
Mar 19 09:44:46.325196 master-0 kubenswrapper[15202]: I0319 09:44:46.325139 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwwvl\" (UniqueName: \"kubernetes.io/projected/628c6cfa-b09e-4a74-a152-20da732dd6db-kube-api-access-gwwvl\") pod \"nmstate-operator-796d4cfff4-h6jnz\" (UID: \"628c6cfa-b09e-4a74-a152-20da732dd6db\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"
Mar 19 09:44:46.486964 master-0 kubenswrapper[15202]: I0319 09:44:46.486892 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"
Mar 19 09:44:46.937392 master-0 kubenswrapper[15202]: W0319 09:44:46.937326 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628c6cfa_b09e_4a74_a152_20da732dd6db.slice/crio-20bb354e5abccff5044fe8e0871cd6ba5836b22c4fbe483d1d18116acb539166 WatchSource:0}: Error finding container 20bb354e5abccff5044fe8e0871cd6ba5836b22c4fbe483d1d18116acb539166: Status 404 returned error can't find the container with id 20bb354e5abccff5044fe8e0871cd6ba5836b22c4fbe483d1d18116acb539166
Mar 19 09:44:46.937933 master-0 kubenswrapper[15202]: I0319 09:44:46.937887 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz"]
Mar 19 09:44:47.080175 master-0 kubenswrapper[15202]: I0319 09:44:47.080048 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz" event={"ID":"628c6cfa-b09e-4a74-a152-20da732dd6db","Type":"ContainerStarted","Data":"20bb354e5abccff5044fe8e0871cd6ba5836b22c4fbe483d1d18116acb539166"}
Mar 19 09:44:51.121322 master-0 kubenswrapper[15202]: I0319 09:44:51.121186 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz" event={"ID":"628c6cfa-b09e-4a74-a152-20da732dd6db","Type":"ContainerStarted","Data":"57e6eb28e42c6c0f13f69c7655760965dc196b83e1ca5d20b01cbf11cf41fc59"}
Mar 19 09:44:51.186497 master-0 kubenswrapper[15202]: I0319 09:44:51.184696 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"]
Mar 19 09:44:51.189501 master-0 kubenswrapper[15202]: I0319 09:44:51.187337 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.192487 master-0 kubenswrapper[15202]: I0319 09:44:51.192419 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 19 09:44:51.192566 master-0 kubenswrapper[15202]: I0319 09:44:51.192556 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 19 09:44:51.196492 master-0 kubenswrapper[15202]: I0319 09:44:51.195901 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h6jnz" podStartSLOduration=1.448196792 podStartE2EDuration="5.195880643s" podCreationTimestamp="2026-03-19 09:44:46 +0000 UTC" firstStartedPulling="2026-03-19 09:44:46.93966823 +0000 UTC m=+1204.325083046" lastFinishedPulling="2026-03-19 09:44:50.687352081 +0000 UTC m=+1208.072766897" observedRunningTime="2026-03-19 09:44:51.166867688 +0000 UTC m=+1208.552282514" watchObservedRunningTime="2026-03-19 09:44:51.195880643 +0000 UTC m=+1208.581295459"
Mar 19 09:44:51.199174 master-0 kubenswrapper[15202]: I0319 09:44:51.199122 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 19 09:44:51.199366 master-0 kubenswrapper[15202]: I0319 09:44:51.199324 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 19 09:44:51.218487 master-0 kubenswrapper[15202]: I0319 09:44:51.213765 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"]
Mar 19 09:44:51.320608 master-0 kubenswrapper[15202]: I0319 09:44:51.320530 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52d0ba98-8db8-45ec-b212-8bec41dac138-apiservice-cert\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.320845 master-0 kubenswrapper[15202]: I0319 09:44:51.320694 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52d0ba98-8db8-45ec-b212-8bec41dac138-webhook-cert\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.320845 master-0 kubenswrapper[15202]: I0319 09:44:51.320770 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd4p9\" (UniqueName: \"kubernetes.io/projected/52d0ba98-8db8-45ec-b212-8bec41dac138-kube-api-access-xd4p9\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.847422 master-0 kubenswrapper[15202]: I0319 09:44:51.847331 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52d0ba98-8db8-45ec-b212-8bec41dac138-apiservice-cert\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.847970 master-0 kubenswrapper[15202]: I0319 09:44:51.847925 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52d0ba98-8db8-45ec-b212-8bec41dac138-webhook-cert\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.848176 master-0 kubenswrapper[15202]: I0319 09:44:51.848140 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd4p9\" (UniqueName: \"kubernetes.io/projected/52d0ba98-8db8-45ec-b212-8bec41dac138-kube-api-access-xd4p9\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.852897 master-0 kubenswrapper[15202]: I0319 09:44:51.852852 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52d0ba98-8db8-45ec-b212-8bec41dac138-apiservice-cert\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.855802 master-0 kubenswrapper[15202]: I0319 09:44:51.855765 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52d0ba98-8db8-45ec-b212-8bec41dac138-webhook-cert\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:51.909308 master-0 kubenswrapper[15202]: I0319 09:44:51.905618 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd4p9\" (UniqueName: \"kubernetes.io/projected/52d0ba98-8db8-45ec-b212-8bec41dac138-kube-api-access-xd4p9\") pod \"metallb-operator-controller-manager-6d7b76b756-hw274\" (UID: \"52d0ba98-8db8-45ec-b212-8bec41dac138\") " pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:52.033241 master-0 kubenswrapper[15202]: I0319 09:44:52.033175 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"]
Mar 19 09:44:52.034376 master-0 kubenswrapper[15202]: I0319 09:44:52.034338 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.036806 master-0 kubenswrapper[15202]: I0319 09:44:52.036782 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 19 09:44:52.041416 master-0 kubenswrapper[15202]: I0319 09:44:52.041383 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 19 09:44:52.053767 master-0 kubenswrapper[15202]: I0319 09:44:52.053704 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87dc574f-d263-420b-9029-edc87ea6c142-apiservice-cert\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.054059 master-0 kubenswrapper[15202]: I0319 09:44:52.054035 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87dc574f-d263-420b-9029-edc87ea6c142-webhook-cert\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.054238 master-0 kubenswrapper[15202]: I0319 09:44:52.054218 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghz64\" (UniqueName: \"kubernetes.io/projected/87dc574f-d263-420b-9029-edc87ea6c142-kube-api-access-ghz64\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.069165 master-0 kubenswrapper[15202]: I0319 09:44:52.065591 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"]
Mar 19 09:44:52.159629 master-0 kubenswrapper[15202]: I0319 09:44:52.155769 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghz64\" (UniqueName: \"kubernetes.io/projected/87dc574f-d263-420b-9029-edc87ea6c142-kube-api-access-ghz64\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.159629 master-0 kubenswrapper[15202]: I0319 09:44:52.155880 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87dc574f-d263-420b-9029-edc87ea6c142-apiservice-cert\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.159629 master-0 kubenswrapper[15202]: I0319 09:44:52.155915 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87dc574f-d263-420b-9029-edc87ea6c142-webhook-cert\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.160788 master-0 kubenswrapper[15202]: I0319 09:44:52.160740 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87dc574f-d263-420b-9029-edc87ea6c142-webhook-cert\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.161351 master-0 kubenswrapper[15202]: I0319 09:44:52.161308 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87dc574f-d263-420b-9029-edc87ea6c142-apiservice-cert\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.175031 master-0 kubenswrapper[15202]: I0319 09:44:52.174975 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:44:52.177629 master-0 kubenswrapper[15202]: I0319 09:44:52.177573 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghz64\" (UniqueName: \"kubernetes.io/projected/87dc574f-d263-420b-9029-edc87ea6c142-kube-api-access-ghz64\") pod \"metallb-operator-webhook-server-754b74fdf5-vvbj2\" (UID: \"87dc574f-d263-420b-9029-edc87ea6c142\") " pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.351157 master-0 kubenswrapper[15202]: I0319 09:44:52.351098 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:44:52.537935 master-0 kubenswrapper[15202]: I0319 09:44:52.537873 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"]
Mar 19 09:44:52.541978 master-0 kubenswrapper[15202]: I0319 09:44:52.541933 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.544700 master-0 kubenswrapper[15202]: I0319 09:44:52.544534 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 19 09:44:52.546716 master-0 kubenswrapper[15202]: I0319 09:44:52.546627 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:44:52.578852 master-0 kubenswrapper[15202]: I0319 09:44:52.578798 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a879900b-5a61-443e-bf19-609331de69c6-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-mzwh8\" (UID: \"a879900b-5a61-443e-bf19-609331de69c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.580024 master-0 kubenswrapper[15202]: I0319 09:44:52.579937 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gprc6\" (UniqueName: \"kubernetes.io/projected/a879900b-5a61-443e-bf19-609331de69c6-kube-api-access-gprc6\") pod \"cert-manager-operator-controller-manager-66c8bdd694-mzwh8\" (UID: \"a879900b-5a61-443e-bf19-609331de69c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.585224 master-0 kubenswrapper[15202]: I0319 09:44:52.585184 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"]
Mar 19 09:44:52.682067 master-0 kubenswrapper[15202]: I0319 09:44:52.681717 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gprc6\" (UniqueName: \"kubernetes.io/projected/a879900b-5a61-443e-bf19-609331de69c6-kube-api-access-gprc6\") pod \"cert-manager-operator-controller-manager-66c8bdd694-mzwh8\" (UID: \"a879900b-5a61-443e-bf19-609331de69c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.683548 master-0 kubenswrapper[15202]: I0319 09:44:52.682279 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a879900b-5a61-443e-bf19-609331de69c6-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-mzwh8\" (UID: \"a879900b-5a61-443e-bf19-609331de69c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.683548 master-0 kubenswrapper[15202]: I0319 09:44:52.682923 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a879900b-5a61-443e-bf19-609331de69c6-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-mzwh8\" (UID: \"a879900b-5a61-443e-bf19-609331de69c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.707597 master-0 kubenswrapper[15202]: I0319 09:44:52.707549 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gprc6\" (UniqueName: \"kubernetes.io/projected/a879900b-5a61-443e-bf19-609331de69c6-kube-api-access-gprc6\") pod \"cert-manager-operator-controller-manager-66c8bdd694-mzwh8\" (UID: \"a879900b-5a61-443e-bf19-609331de69c6\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.832571 master-0 kubenswrapper[15202]: I0319 09:44:52.832523 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"]
Mar 19 09:44:52.834716 master-0 kubenswrapper[15202]: W0319 09:44:52.834653 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d0ba98_8db8_45ec_b212_8bec41dac138.slice/crio-1b94518480da91cfc4296f795cd1f7d17f78318117a45824546f5917ab6f7ed6 WatchSource:0}: Error finding container 1b94518480da91cfc4296f795cd1f7d17f78318117a45824546f5917ab6f7ed6: Status 404 returned error can't find the container with id 1b94518480da91cfc4296f795cd1f7d17f78318117a45824546f5917ab6f7ed6
Mar 19 09:44:52.859276 master-0 kubenswrapper[15202]: I0319 09:44:52.859226 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"
Mar 19 09:44:52.939456 master-0 kubenswrapper[15202]: I0319 09:44:52.939345 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"]
Mar 19 09:44:52.946226 master-0 kubenswrapper[15202]: W0319 09:44:52.946173 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87dc574f_d263_420b_9029_edc87ea6c142.slice/crio-6c2a1c5b7fc54ef2d54bcbbbdcd52f1ad9d08c09bedf4822fd2810802dc1ea4c WatchSource:0}: Error finding container 6c2a1c5b7fc54ef2d54bcbbbdcd52f1ad9d08c09bedf4822fd2810802dc1ea4c: Status 404 returned error can't find the container with id 6c2a1c5b7fc54ef2d54bcbbbdcd52f1ad9d08c09bedf4822fd2810802dc1ea4c
Mar 19 09:44:53.138449 master-0 kubenswrapper[15202]: I0319 09:44:53.138379 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274" event={"ID":"52d0ba98-8db8-45ec-b212-8bec41dac138","Type":"ContainerStarted","Data":"1b94518480da91cfc4296f795cd1f7d17f78318117a45824546f5917ab6f7ed6"}
Mar 19 09:44:53.140180 master-0 kubenswrapper[15202]: I0319 09:44:53.140120 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2" event={"ID":"87dc574f-d263-420b-9029-edc87ea6c142","Type":"ContainerStarted","Data":"6c2a1c5b7fc54ef2d54bcbbbdcd52f1ad9d08c09bedf4822fd2810802dc1ea4c"}
Mar 19 09:44:53.283660 master-0 kubenswrapper[15202]: I0319 09:44:53.281176 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8"]
Mar 19 09:44:53.298293 master-0 kubenswrapper[15202]: W0319 09:44:53.294697 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda879900b_5a61_443e_bf19_609331de69c6.slice/crio-fc2a7370cbc50bdb05f90d0443337ed8901ab78edd69e5862cd0bb5aaef66158 WatchSource:0}: Error finding container fc2a7370cbc50bdb05f90d0443337ed8901ab78edd69e5862cd0bb5aaef66158: Status 404 returned error can't find the container with id fc2a7370cbc50bdb05f90d0443337ed8901ab78edd69e5862cd0bb5aaef66158
Mar 19 09:44:54.150069 master-0 kubenswrapper[15202]: I0319 09:44:54.149995 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8" event={"ID":"a879900b-5a61-443e-bf19-609331de69c6","Type":"ContainerStarted","Data":"fc2a7370cbc50bdb05f90d0443337ed8901ab78edd69e5862cd0bb5aaef66158"}
Mar 19 09:45:03.349392 master-0 kubenswrapper[15202]: I0319 09:45:03.349273 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2" event={"ID":"87dc574f-d263-420b-9029-edc87ea6c142","Type":"ContainerStarted","Data":"7ab9a7319998f7567364abc00e16520b0b754627d9227da41a9dd4f55f924855"}
Mar 19 09:45:03.350985 master-0 kubenswrapper[15202]: I0319 09:45:03.350964 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:45:03.360693 master-0 kubenswrapper[15202]: I0319 09:45:03.360630 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274" event={"ID":"52d0ba98-8db8-45ec-b212-8bec41dac138","Type":"ContainerStarted","Data":"8c9f49901e76e9bba42b3b354b9c392eab2657a861fc1f1741bd8b1c18a83de5"}
Mar 19 09:45:03.360873 master-0 kubenswrapper[15202]: I0319 09:45:03.360755 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:45:03.381492 master-0 kubenswrapper[15202]: I0319 09:45:03.379597 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8" event={"ID":"a879900b-5a61-443e-bf19-609331de69c6","Type":"ContainerStarted","Data":"ea819f1358d49d62cd0faf87f1274cc1004fd0fc29aa45984428221086304e1b"}
Mar 19 09:45:03.502974 master-0 kubenswrapper[15202]: I0319 09:45:03.502875 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2" podStartSLOduration=1.482887157 podStartE2EDuration="11.502856764s" podCreationTimestamp="2026-03-19 09:44:52 +0000 UTC" firstStartedPulling="2026-03-19 09:44:52.955935712 +0000 UTC m=+1210.341350528" lastFinishedPulling="2026-03-19 09:45:02.975905309 +0000 UTC m=+1220.361320135" observedRunningTime="2026-03-19 09:45:03.494485988 +0000 UTC m=+1220.879900814" watchObservedRunningTime="2026-03-19 09:45:03.502856764 +0000 UTC m=+1220.888271590"
Mar 19 09:45:03.562242 master-0 kubenswrapper[15202]: I0319 09:45:03.562167 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274" podStartSLOduration=2.516132825 podStartE2EDuration="12.562149583s" podCreationTimestamp="2026-03-19 09:44:51 +0000 UTC" firstStartedPulling="2026-03-19 09:44:52.837743572 +0000 UTC m=+1210.223158388" lastFinishedPulling="2026-03-19 09:45:02.88376033 +0000 UTC m=+1220.269175146" observedRunningTime="2026-03-19 09:45:03.534196005 +0000 UTC m=+1220.919610841" watchObservedRunningTime="2026-03-19 09:45:03.562149583 +0000 UTC m=+1220.947564399"
Mar 19 09:45:03.564542 master-0 kubenswrapper[15202]: I0319 09:45:03.564506 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-mzwh8" podStartSLOduration=1.9645681069999998 podStartE2EDuration="11.564497251s" podCreationTimestamp="2026-03-19 09:44:52 +0000 UTC" firstStartedPulling="2026-03-19 09:44:53.300735042 +0000 UTC m=+1210.686149858" lastFinishedPulling="2026-03-19 09:45:02.900664186 +0000 UTC m=+1220.286079002" observedRunningTime="2026-03-19 09:45:03.557331275 +0000 UTC m=+1220.942746091" watchObservedRunningTime="2026-03-19 09:45:03.564497251 +0000 UTC m=+1220.949912067"
Mar 19 09:45:08.961044 master-0 kubenswrapper[15202]: I0319 09:45:08.960963 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg"]
Mar 19 09:45:08.962649 master-0 kubenswrapper[15202]: I0319 09:45:08.962626 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg"
Mar 19 09:45:08.977813 master-0 kubenswrapper[15202]: I0319 09:45:08.977763 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 19 09:45:08.988135 master-0 kubenswrapper[15202]: I0319 09:45:08.988061 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 19 09:45:08.998895 master-0 kubenswrapper[15202]: I0319 09:45:08.998833 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg"]
Mar 19 09:45:09.110141 master-0 kubenswrapper[15202]: I0319 09:45:09.110041 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdvkj\" (UniqueName: \"kubernetes.io/projected/b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13-kube-api-access-jdvkj\") pod \"obo-prometheus-operator-8ff7d675-wdrhg\" (UID: \"b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg"
Mar 19 09:45:09.212250 master-0 kubenswrapper[15202]: I0319 09:45:09.212081 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdvkj\" (UniqueName: \"kubernetes.io/projected/b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13-kube-api-access-jdvkj\") pod \"obo-prometheus-operator-8ff7d675-wdrhg\" (UID: \"b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg"
Mar 19 09:45:09.250176 master-0 kubenswrapper[15202]: I0319 09:45:09.250108 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdvkj\" (UniqueName: \"kubernetes.io/projected/b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13-kube-api-access-jdvkj\") pod \"obo-prometheus-operator-8ff7d675-wdrhg\" (UID: \"b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13\") "
pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg" Mar 19 09:45:09.387783 master-0 kubenswrapper[15202]: I0319 09:45:09.387709 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg" Mar 19 09:45:09.611688 master-0 kubenswrapper[15202]: I0319 09:45:09.611605 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc"] Mar 19 09:45:09.613257 master-0 kubenswrapper[15202]: I0319 09:45:09.613215 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.631287 master-0 kubenswrapper[15202]: I0319 09:45:09.631224 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 09:45:09.656491 master-0 kubenswrapper[15202]: I0319 09:45:09.651543 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96"] Mar 19 09:45:09.656491 master-0 kubenswrapper[15202]: I0319 09:45:09.653032 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.675003 master-0 kubenswrapper[15202]: I0319 09:45:09.673013 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc"] Mar 19 09:45:09.688028 master-0 kubenswrapper[15202]: I0319 09:45:09.687347 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96"] Mar 19 09:45:09.759540 master-0 kubenswrapper[15202]: I0319 09:45:09.742638 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a8c6e71-17d2-46e6-9d69-ba5441a2535e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96\" (UID: \"8a8c6e71-17d2-46e6-9d69-ba5441a2535e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.759540 master-0 kubenswrapper[15202]: I0319 09:45:09.742736 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60088e9d-9a6b-4e01-9c7b-7d4f4305cdad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc\" (UID: \"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.759540 master-0 kubenswrapper[15202]: I0319 09:45:09.742793 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60088e9d-9a6b-4e01-9c7b-7d4f4305cdad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc\" (UID: \"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.759540 master-0 kubenswrapper[15202]: I0319 09:45:09.742866 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a8c6e71-17d2-46e6-9d69-ba5441a2535e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96\" (UID: \"8a8c6e71-17d2-46e6-9d69-ba5441a2535e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.847587 master-0 kubenswrapper[15202]: I0319 09:45:09.846551 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a8c6e71-17d2-46e6-9d69-ba5441a2535e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96\" (UID: \"8a8c6e71-17d2-46e6-9d69-ba5441a2535e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.847587 master-0 kubenswrapper[15202]: I0319 09:45:09.846663 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a8c6e71-17d2-46e6-9d69-ba5441a2535e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96\" (UID: \"8a8c6e71-17d2-46e6-9d69-ba5441a2535e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.847587 master-0 kubenswrapper[15202]: I0319 09:45:09.846708 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60088e9d-9a6b-4e01-9c7b-7d4f4305cdad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc\" (UID: \"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 
09:45:09.847587 master-0 kubenswrapper[15202]: I0319 09:45:09.846752 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60088e9d-9a6b-4e01-9c7b-7d4f4305cdad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc\" (UID: \"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.857091 master-0 kubenswrapper[15202]: I0319 09:45:09.857043 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a8c6e71-17d2-46e6-9d69-ba5441a2535e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96\" (UID: \"8a8c6e71-17d2-46e6-9d69-ba5441a2535e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.857380 master-0 kubenswrapper[15202]: I0319 09:45:09.857042 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a8c6e71-17d2-46e6-9d69-ba5441a2535e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96\" (UID: \"8a8c6e71-17d2-46e6-9d69-ba5441a2535e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:09.857652 master-0 kubenswrapper[15202]: I0319 09:45:09.857579 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60088e9d-9a6b-4e01-9c7b-7d4f4305cdad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc\" (UID: \"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.857724 master-0 kubenswrapper[15202]: I0319 09:45:09.857674 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60088e9d-9a6b-4e01-9c7b-7d4f4305cdad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc\" (UID: \"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.944886 master-0 kubenswrapper[15202]: I0319 09:45:09.944443 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg"] Mar 19 09:45:09.946504 master-0 kubenswrapper[15202]: I0319 09:45:09.946185 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" Mar 19 09:45:09.993520 master-0 kubenswrapper[15202]: I0319 09:45:09.991292 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" Mar 19 09:45:10.046010 master-0 kubenswrapper[15202]: I0319 09:45:10.045726 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lm5gw"] Mar 19 09:45:10.047204 master-0 kubenswrapper[15202]: I0319 09:45:10.047153 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:10.052355 master-0 kubenswrapper[15202]: I0319 09:45:10.052313 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 09:45:10.152857 master-0 kubenswrapper[15202]: I0319 09:45:10.152785 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87j5\" (UniqueName: \"kubernetes.io/projected/4e41c756-34e7-48a9-a4dd-d15dce86d91c-kube-api-access-x87j5\") pod \"observability-operator-6dd7dd855f-lm5gw\" (UID: \"4e41c756-34e7-48a9-a4dd-d15dce86d91c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:10.153112 master-0 kubenswrapper[15202]: I0319 09:45:10.152970 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e41c756-34e7-48a9-a4dd-d15dce86d91c-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lm5gw\" (UID: \"4e41c756-34e7-48a9-a4dd-d15dce86d91c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:10.255023 master-0 kubenswrapper[15202]: I0319 09:45:10.254949 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e41c756-34e7-48a9-a4dd-d15dce86d91c-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lm5gw\" (UID: \"4e41c756-34e7-48a9-a4dd-d15dce86d91c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:10.255502 master-0 kubenswrapper[15202]: I0319 09:45:10.255454 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x87j5\" (UniqueName: \"kubernetes.io/projected/4e41c756-34e7-48a9-a4dd-d15dce86d91c-kube-api-access-x87j5\") pod 
\"observability-operator-6dd7dd855f-lm5gw\" (UID: \"4e41c756-34e7-48a9-a4dd-d15dce86d91c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:10.258696 master-0 kubenswrapper[15202]: I0319 09:45:10.258669 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e41c756-34e7-48a9-a4dd-d15dce86d91c-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lm5gw\" (UID: \"4e41c756-34e7-48a9-a4dd-d15dce86d91c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:10.474429 master-0 kubenswrapper[15202]: I0319 09:45:10.474347 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg" event={"ID":"b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13","Type":"ContainerStarted","Data":"ab29f4f889d8313e41da4957e32de951045d05b41aa02c650b453ce039475986"} Mar 19 09:45:10.570589 master-0 kubenswrapper[15202]: I0319 09:45:10.570413 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lm5gw"] Mar 19 09:45:11.204663 master-0 kubenswrapper[15202]: I0319 09:45:11.204597 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x87j5\" (UniqueName: \"kubernetes.io/projected/4e41c756-34e7-48a9-a4dd-d15dce86d91c-kube-api-access-x87j5\") pod \"observability-operator-6dd7dd855f-lm5gw\" (UID: \"4e41c756-34e7-48a9-a4dd-d15dce86d91c\") " pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:11.214661 master-0 kubenswrapper[15202]: I0319 09:45:11.214576 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-z82hq"] Mar 19 09:45:11.218107 master-0 kubenswrapper[15202]: I0319 09:45:11.218071 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.223965 master-0 kubenswrapper[15202]: I0319 09:45:11.223407 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 09:45:11.233629 master-0 kubenswrapper[15202]: I0319 09:45:11.230059 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 09:45:11.268500 master-0 kubenswrapper[15202]: I0319 09:45:11.265991 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" Mar 19 09:45:11.276502 master-0 kubenswrapper[15202]: I0319 09:45:11.275538 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96"] Mar 19 09:45:11.282515 master-0 kubenswrapper[15202]: I0319 09:45:11.281252 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-z82hq\" (UID: \"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.282515 master-0 kubenswrapper[15202]: I0319 09:45:11.281326 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfkx\" (UniqueName: \"kubernetes.io/projected/e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c-kube-api-access-xcfkx\") pod \"cert-manager-cainjector-5545bd876-z82hq\" (UID: \"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.292501 master-0 kubenswrapper[15202]: I0319 09:45:11.291868 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc"] Mar 19 09:45:11.352554 master-0 kubenswrapper[15202]: I0319 09:45:11.340142 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-z82hq"] Mar 19 09:45:11.385437 master-0 kubenswrapper[15202]: I0319 09:45:11.385343 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfkx\" (UniqueName: \"kubernetes.io/projected/e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c-kube-api-access-xcfkx\") pod \"cert-manager-cainjector-5545bd876-z82hq\" (UID: \"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.385806 master-0 kubenswrapper[15202]: I0319 09:45:11.385518 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-z82hq\" (UID: \"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.414545 master-0 kubenswrapper[15202]: I0319 09:45:11.410624 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-z82hq\" (UID: \"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.415919 master-0 kubenswrapper[15202]: I0319 09:45:11.415880 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfkx\" (UniqueName: \"kubernetes.io/projected/e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c-kube-api-access-xcfkx\") pod \"cert-manager-cainjector-5545bd876-z82hq\" (UID: \"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.530514 master-0 kubenswrapper[15202]: I0319 09:45:11.528455 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" event={"ID":"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad","Type":"ContainerStarted","Data":"5fdf3394acfde0349cc762fbbde84a2c6e28635915d281dfea97cbb258ff94ef"} Mar 19 09:45:11.536661 master-0 kubenswrapper[15202]: I0319 09:45:11.536588 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" event={"ID":"8a8c6e71-17d2-46e6-9d69-ba5441a2535e","Type":"ContainerStarted","Data":"e7687bf81f5960fb7710a577ff58b2c83e1c433140fc417b93d3d7da8a9b2c46"} Mar 19 09:45:11.596091 master-0 kubenswrapper[15202]: I0319 09:45:11.596028 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" Mar 19 09:45:11.714486 master-0 kubenswrapper[15202]: I0319 09:45:11.714390 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-f44656786-v74wx"] Mar 19 09:45:11.736656 master-0 kubenswrapper[15202]: I0319 09:45:11.734686 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.739531 master-0 kubenswrapper[15202]: I0319 09:45:11.739187 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-f44656786-v74wx"] Mar 19 09:45:11.747250 master-0 kubenswrapper[15202]: I0319 09:45:11.746911 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 09:45:11.828498 master-0 kubenswrapper[15202]: I0319 09:45:11.828404 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24x9t\" (UniqueName: \"kubernetes.io/projected/faeed260-495c-4417-a289-e51868183e76-kube-api-access-24x9t\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.828774 master-0 kubenswrapper[15202]: I0319 09:45:11.828600 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/faeed260-495c-4417-a289-e51868183e76-webhook-cert\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.828774 master-0 kubenswrapper[15202]: I0319 09:45:11.828741 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/faeed260-495c-4417-a289-e51868183e76-apiservice-cert\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.829192 master-0 kubenswrapper[15202]: I0319 09:45:11.829136 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/faeed260-495c-4417-a289-e51868183e76-openshift-service-ca\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.933197 master-0 kubenswrapper[15202]: I0319 09:45:11.930992 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/faeed260-495c-4417-a289-e51868183e76-openshift-service-ca\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.933197 master-0 kubenswrapper[15202]: I0319 09:45:11.931076 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24x9t\" (UniqueName: \"kubernetes.io/projected/faeed260-495c-4417-a289-e51868183e76-kube-api-access-24x9t\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.933197 master-0 kubenswrapper[15202]: I0319 09:45:11.931100 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/faeed260-495c-4417-a289-e51868183e76-webhook-cert\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.933197 master-0 kubenswrapper[15202]: I0319 09:45:11.931126 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/faeed260-495c-4417-a289-e51868183e76-apiservice-cert\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.933197 master-0 kubenswrapper[15202]: 
I0319 09:45:11.932017 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/faeed260-495c-4417-a289-e51868183e76-openshift-service-ca\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.938309 master-0 kubenswrapper[15202]: I0319 09:45:11.937424 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/faeed260-495c-4417-a289-e51868183e76-apiservice-cert\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:11.945506 master-0 kubenswrapper[15202]: I0319 09:45:11.944615 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/faeed260-495c-4417-a289-e51868183e76-webhook-cert\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:12.000008 master-0 kubenswrapper[15202]: I0319 09:45:11.970175 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lm5gw"] Mar 19 09:45:12.015494 master-0 kubenswrapper[15202]: I0319 09:45:12.005627 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24x9t\" (UniqueName: \"kubernetes.io/projected/faeed260-495c-4417-a289-e51868183e76-kube-api-access-24x9t\") pod \"perses-operator-f44656786-v74wx\" (UID: \"faeed260-495c-4417-a289-e51868183e76\") " pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:12.015494 master-0 kubenswrapper[15202]: W0319 09:45:12.012752 15202 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e41c756_34e7_48a9_a4dd_d15dce86d91c.slice/crio-457334932c4413199f88066b6acc57ce6dcd69000ad3328fdbf8de7da2b5e242 WatchSource:0}: Error finding container 457334932c4413199f88066b6acc57ce6dcd69000ad3328fdbf8de7da2b5e242: Status 404 returned error can't find the container with id 457334932c4413199f88066b6acc57ce6dcd69000ad3328fdbf8de7da2b5e242 Mar 19 09:45:12.104516 master-0 kubenswrapper[15202]: I0319 09:45:12.100989 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-f44656786-v74wx" Mar 19 09:45:12.179288 master-0 kubenswrapper[15202]: I0319 09:45:12.178546 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-mgklh"] Mar 19 09:45:12.180023 master-0 kubenswrapper[15202]: I0319 09:45:12.179939 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh" Mar 19 09:45:12.188554 master-0 kubenswrapper[15202]: I0319 09:45:12.184885 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-mgklh"] Mar 19 09:45:12.226534 master-0 kubenswrapper[15202]: W0319 09:45:12.225554 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3121b63_7ca6_46e2_a77d_a9ac2bd4f90c.slice/crio-0b09532344736bbc4fa81ab5168ad62e2a2bc9df1278991015c5ded3434feb95 WatchSource:0}: Error finding container 0b09532344736bbc4fa81ab5168ad62e2a2bc9df1278991015c5ded3434feb95: Status 404 returned error can't find the container with id 0b09532344736bbc4fa81ab5168ad62e2a2bc9df1278991015c5ded3434feb95 Mar 19 09:45:12.226996 master-0 kubenswrapper[15202]: I0319 09:45:12.226600 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-z82hq"] Mar 19 09:45:12.286507 master-0 
kubenswrapper[15202]: I0319 09:45:12.286352 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ltr4\" (UniqueName: \"kubernetes.io/projected/fca9305a-b6de-4bca-a3cc-647f0a3bd7ad-kube-api-access-2ltr4\") pod \"cert-manager-webhook-6888856db4-mgklh\" (UID: \"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad\") " pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.286507 master-0 kubenswrapper[15202]: I0319 09:45:12.286446 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fca9305a-b6de-4bca-a3cc-647f0a3bd7ad-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-mgklh\" (UID: \"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad\") " pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.388528 master-0 kubenswrapper[15202]: I0319 09:45:12.388377 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ltr4\" (UniqueName: \"kubernetes.io/projected/fca9305a-b6de-4bca-a3cc-647f0a3bd7ad-kube-api-access-2ltr4\") pod \"cert-manager-webhook-6888856db4-mgklh\" (UID: \"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad\") " pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.388528 master-0 kubenswrapper[15202]: I0319 09:45:12.388461 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fca9305a-b6de-4bca-a3cc-647f0a3bd7ad-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-mgklh\" (UID: \"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad\") " pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.416512 master-0 kubenswrapper[15202]: I0319 09:45:12.412829 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fca9305a-b6de-4bca-a3cc-647f0a3bd7ad-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-mgklh\" (UID: \"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad\") " pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.431695 master-0 kubenswrapper[15202]: I0319 09:45:12.429732 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ltr4\" (UniqueName: \"kubernetes.io/projected/fca9305a-b6de-4bca-a3cc-647f0a3bd7ad-kube-api-access-2ltr4\") pod \"cert-manager-webhook-6888856db4-mgklh\" (UID: \"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad\") " pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.513490 master-0 kubenswrapper[15202]: I0319 09:45:12.510708 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:12.570953 master-0 kubenswrapper[15202]: I0319 09:45:12.570860 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" event={"ID":"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c","Type":"ContainerStarted","Data":"0b09532344736bbc4fa81ab5168ad62e2a2bc9df1278991015c5ded3434feb95"}
Mar 19 09:45:12.573335 master-0 kubenswrapper[15202]: I0319 09:45:12.573247 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" event={"ID":"4e41c756-34e7-48a9-a4dd-d15dce86d91c","Type":"ContainerStarted","Data":"457334932c4413199f88066b6acc57ce6dcd69000ad3328fdbf8de7da2b5e242"}
Mar 19 09:45:12.914493 master-0 kubenswrapper[15202]: I0319 09:45:12.894020 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-f44656786-v74wx"]
Mar 19 09:45:13.151062 master-0 kubenswrapper[15202]: W0319 09:45:13.150618 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfca9305a_b6de_4bca_a3cc_647f0a3bd7ad.slice/crio-addd6ebd2c9113cf2a5ec3c9cb6de579721a7efb24b53f28ea9dea2900d54ef9 WatchSource:0}: Error finding container addd6ebd2c9113cf2a5ec3c9cb6de579721a7efb24b53f28ea9dea2900d54ef9: Status 404 returned error can't find the container with id addd6ebd2c9113cf2a5ec3c9cb6de579721a7efb24b53f28ea9dea2900d54ef9
Mar 19 09:45:13.152561 master-0 kubenswrapper[15202]: I0319 09:45:13.152518 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-mgklh"]
Mar 19 09:45:13.592459 master-0 kubenswrapper[15202]: I0319 09:45:13.590511 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh" event={"ID":"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad","Type":"ContainerStarted","Data":"addd6ebd2c9113cf2a5ec3c9cb6de579721a7efb24b53f28ea9dea2900d54ef9"}
Mar 19 09:45:13.594785 master-0 kubenswrapper[15202]: I0319 09:45:13.594729 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-f44656786-v74wx" event={"ID":"faeed260-495c-4417-a289-e51868183e76","Type":"ContainerStarted","Data":"6181553dba8d46d29f222f410116596c9341e1abfd14ddf888b7d08c6d7efef3"}
Mar 19 09:45:22.358374 master-0 kubenswrapper[15202]: I0319 09:45:22.357642 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-754b74fdf5-vvbj2"
Mar 19 09:45:24.737117 master-0 kubenswrapper[15202]: I0319 09:45:24.737061 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" event={"ID":"8a8c6e71-17d2-46e6-9d69-ba5441a2535e","Type":"ContainerStarted","Data":"9ae4f99acf96821301eff13aebfa51729728444d1f02f366fe820024caa22b0d"}
Mar 19 09:45:24.742892 master-0 kubenswrapper[15202]: I0319 09:45:24.740599 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-f44656786-v74wx" event={"ID":"faeed260-495c-4417-a289-e51868183e76","Type":"ContainerStarted","Data":"bc6bea18cd24e1503c4d12f6872386a37262a6ca87de30325f402acf7da3d440"}
Mar 19 09:45:24.742892 master-0 kubenswrapper[15202]: I0319 09:45:24.741216 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-f44656786-v74wx"
Mar 19 09:45:24.747794 master-0 kubenswrapper[15202]: I0319 09:45:24.747734 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" event={"ID":"e3121b63-7ca6-46e2-a77d-a9ac2bd4f90c","Type":"ContainerStarted","Data":"9cd1358c81126dacbb40794e471138c1e306254c116754f0f3afc490791bb119"}
Mar 19 09:45:24.750700 master-0 kubenswrapper[15202]: I0319 09:45:24.750657 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" event={"ID":"4e41c756-34e7-48a9-a4dd-d15dce86d91c","Type":"ContainerStarted","Data":"7a3d49701d55d7309a12a8326d92308a95b95de95e08835e278090f91c345cac"}
Mar 19 09:45:24.752820 master-0 kubenswrapper[15202]: I0319 09:45:24.752487 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw"
Mar 19 09:45:24.754150 master-0 kubenswrapper[15202]: I0319 09:45:24.754112 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw"
Mar 19 09:45:24.756542 master-0 kubenswrapper[15202]: I0319 09:45:24.756490 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" event={"ID":"60088e9d-9a6b-4e01-9c7b-7d4f4305cdad","Type":"ContainerStarted","Data":"2938344f8d55c3958ce5c9936999de40db122f8e59f83ce7b0249b02a644da0e"}
Mar 19 09:45:24.761170 master-0 kubenswrapper[15202]: I0319 09:45:24.761097 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh" event={"ID":"fca9305a-b6de-4bca-a3cc-647f0a3bd7ad","Type":"ContainerStarted","Data":"ce46c73587f5df6e22196949d3391fc3c1700a854795da06fea40cfab63fde35"}
Mar 19 09:45:24.761321 master-0 kubenswrapper[15202]: I0319 09:45:24.761243 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:24.765129 master-0 kubenswrapper[15202]: I0319 09:45:24.764551 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg" event={"ID":"b42d9c1c-3ad2-4b4a-bd1e-fd670c605a13","Type":"ContainerStarted","Data":"a80af05b449577f2b67ce9177063a777910462a940f64e44792b4a62c3a391af"}
Mar 19 09:45:24.781089 master-0 kubenswrapper[15202]: I0319 09:45:24.780995 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-9wk96" podStartSLOduration=3.621124277 podStartE2EDuration="15.780972644s" podCreationTimestamp="2026-03-19 09:45:09 +0000 UTC" firstStartedPulling="2026-03-19 09:45:11.247868232 +0000 UTC m=+1228.633283058" lastFinishedPulling="2026-03-19 09:45:23.407716609 +0000 UTC m=+1240.793131425" observedRunningTime="2026-03-19 09:45:24.771532891 +0000 UTC m=+1242.156947717" watchObservedRunningTime="2026-03-19 09:45:24.780972644 +0000 UTC m=+1242.166387460"
Mar 19 09:45:24.828197 master-0 kubenswrapper[15202]: I0319 09:45:24.827809 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-lm5gw" podStartSLOduration=4.431786458 podStartE2EDuration="15.827783396s" podCreationTimestamp="2026-03-19 09:45:09 +0000 UTC" firstStartedPulling="2026-03-19 09:45:12.063114517 +0000 UTC m=+1229.448529333" lastFinishedPulling="2026-03-19 09:45:23.459111435 +0000 UTC m=+1240.844526271" observedRunningTime="2026-03-19 09:45:24.820913428 +0000 UTC m=+1242.206328254" watchObservedRunningTime="2026-03-19 09:45:24.827783396 +0000 UTC m=+1242.213198212"
Mar 19 09:45:24.868516 master-0 kubenswrapper[15202]: I0319 09:45:24.868341 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wdrhg" podStartSLOduration=3.450236155 podStartE2EDuration="16.868319724s" podCreationTimestamp="2026-03-19 09:45:08 +0000 UTC" firstStartedPulling="2026-03-19 09:45:09.981164012 +0000 UTC m=+1227.366578828" lastFinishedPulling="2026-03-19 09:45:23.399247581 +0000 UTC m=+1240.784662397" observedRunningTime="2026-03-19 09:45:24.863084736 +0000 UTC m=+1242.248499572" watchObservedRunningTime="2026-03-19 09:45:24.868319724 +0000 UTC m=+1242.253734540"
Mar 19 09:45:24.963421 master-0 kubenswrapper[15202]: I0319 09:45:24.963186 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d55d7cd7f-bv7hc" podStartSLOduration=3.773833486 podStartE2EDuration="15.96316602s" podCreationTimestamp="2026-03-19 09:45:09 +0000 UTC" firstStartedPulling="2026-03-19 09:45:11.248269472 +0000 UTC m=+1228.633684298" lastFinishedPulling="2026-03-19 09:45:23.437602016 +0000 UTC m=+1240.823016832" observedRunningTime="2026-03-19 09:45:24.926981889 +0000 UTC m=+1242.312396705" watchObservedRunningTime="2026-03-19 09:45:24.96316602 +0000 UTC m=+1242.348580846"
Mar 19 09:45:24.990345 master-0 kubenswrapper[15202]: I0319 09:45:24.990189 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-f44656786-v74wx" podStartSLOduration=3.478583222 podStartE2EDuration="13.990163084s" podCreationTimestamp="2026-03-19 09:45:11 +0000 UTC" firstStartedPulling="2026-03-19 09:45:12.898620809 +0000 UTC m=+1230.284035625" lastFinishedPulling="2026-03-19 09:45:23.410200671 +0000 UTC m=+1240.795615487" observedRunningTime="2026-03-19 09:45:24.971777272 +0000 UTC m=+1242.357192088" watchObservedRunningTime="2026-03-19 09:45:24.990163084 +0000 UTC m=+1242.375577900"
Mar 19 09:45:24.997552 master-0 kubenswrapper[15202]: I0319 09:45:24.997356 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh" podStartSLOduration=2.713029197 podStartE2EDuration="12.997339182s" podCreationTimestamp="2026-03-19 09:45:12 +0000 UTC" firstStartedPulling="2026-03-19 09:45:13.153408543 +0000 UTC m=+1230.538823369" lastFinishedPulling="2026-03-19 09:45:23.437718538 +0000 UTC m=+1240.823133354" observedRunningTime="2026-03-19 09:45:24.989390216 +0000 UTC m=+1242.374805032" watchObservedRunningTime="2026-03-19 09:45:24.997339182 +0000 UTC m=+1242.382753998"
Mar 19 09:45:25.020654 master-0 kubenswrapper[15202]: I0319 09:45:25.020568 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-z82hq" podStartSLOduration=3.8537476489999998 podStartE2EDuration="15.020547083s" podCreationTimestamp="2026-03-19 09:45:10 +0000 UTC" firstStartedPulling="2026-03-19 09:45:12.243512599 +0000 UTC m=+1229.628927415" lastFinishedPulling="2026-03-19 09:45:23.410312033 +0000 UTC m=+1240.795726849" observedRunningTime="2026-03-19 09:45:25.017455687 +0000 UTC m=+1242.402870513" watchObservedRunningTime="2026-03-19 09:45:25.020547083 +0000 UTC m=+1242.405961899"
Mar 19 09:45:26.284022 master-0 kubenswrapper[15202]: I0319 09:45:26.283923 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-29rbn"]
Mar 19 09:45:26.285077 master-0 kubenswrapper[15202]: I0319 09:45:26.285047 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.301329 master-0 kubenswrapper[15202]: I0319 09:45:26.301205 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfvh\" (UniqueName: \"kubernetes.io/projected/2b8e8918-18a9-47b1-86c0-be0fe45968df-kube-api-access-bvfvh\") pod \"cert-manager-545d4d4674-29rbn\" (UID: \"2b8e8918-18a9-47b1-86c0-be0fe45968df\") " pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.301747 master-0 kubenswrapper[15202]: I0319 09:45:26.301411 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b8e8918-18a9-47b1-86c0-be0fe45968df-bound-sa-token\") pod \"cert-manager-545d4d4674-29rbn\" (UID: \"2b8e8918-18a9-47b1-86c0-be0fe45968df\") " pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.307979 master-0 kubenswrapper[15202]: I0319 09:45:26.307908 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-29rbn"]
Mar 19 09:45:26.402770 master-0 kubenswrapper[15202]: I0319 09:45:26.402643 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b8e8918-18a9-47b1-86c0-be0fe45968df-bound-sa-token\") pod \"cert-manager-545d4d4674-29rbn\" (UID: \"2b8e8918-18a9-47b1-86c0-be0fe45968df\") " pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.402770 master-0 kubenswrapper[15202]: I0319 09:45:26.402808 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfvh\" (UniqueName: \"kubernetes.io/projected/2b8e8918-18a9-47b1-86c0-be0fe45968df-kube-api-access-bvfvh\") pod \"cert-manager-545d4d4674-29rbn\" (UID: \"2b8e8918-18a9-47b1-86c0-be0fe45968df\") " pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.431899 master-0 kubenswrapper[15202]: I0319 09:45:26.426430 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfvh\" (UniqueName: \"kubernetes.io/projected/2b8e8918-18a9-47b1-86c0-be0fe45968df-kube-api-access-bvfvh\") pod \"cert-manager-545d4d4674-29rbn\" (UID: \"2b8e8918-18a9-47b1-86c0-be0fe45968df\") " pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.431899 master-0 kubenswrapper[15202]: I0319 09:45:26.427914 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b8e8918-18a9-47b1-86c0-be0fe45968df-bound-sa-token\") pod \"cert-manager-545d4d4674-29rbn\" (UID: \"2b8e8918-18a9-47b1-86c0-be0fe45968df\") " pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:26.610365 master-0 kubenswrapper[15202]: I0319 09:45:26.610125 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-29rbn"
Mar 19 09:45:27.121721 master-0 kubenswrapper[15202]: I0319 09:45:27.119648 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-29rbn"]
Mar 19 09:45:27.122672 master-0 kubenswrapper[15202]: W0319 09:45:27.122378 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b8e8918_18a9_47b1_86c0_be0fe45968df.slice/crio-4d017944f4c8b0de3651ba134195c1f8efb617e60786d0890f934cfd64960694 WatchSource:0}: Error finding container 4d017944f4c8b0de3651ba134195c1f8efb617e60786d0890f934cfd64960694: Status 404 returned error can't find the container with id 4d017944f4c8b0de3651ba134195c1f8efb617e60786d0890f934cfd64960694
Mar 19 09:45:27.807953 master-0 kubenswrapper[15202]: I0319 09:45:27.807875 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-29rbn" event={"ID":"2b8e8918-18a9-47b1-86c0-be0fe45968df","Type":"ContainerStarted","Data":"bec9afd341d2f2c237845d74f9831994913c9cb61a15ee53f57c1daf60d29aa1"}
Mar 19 09:45:27.808870 master-0 kubenswrapper[15202]: I0319 09:45:27.808844 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-29rbn" event={"ID":"2b8e8918-18a9-47b1-86c0-be0fe45968df","Type":"ContainerStarted","Data":"4d017944f4c8b0de3651ba134195c1f8efb617e60786d0890f934cfd64960694"}
Mar 19 09:45:27.846221 master-0 kubenswrapper[15202]: I0319 09:45:27.846117 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-29rbn" podStartSLOduration=1.846085087 podStartE2EDuration="1.846085087s" podCreationTimestamp="2026-03-19 09:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:45:27.839165377 +0000 UTC m=+1245.224580193" watchObservedRunningTime="2026-03-19 09:45:27.846085087 +0000 UTC m=+1245.231499913"
Mar 19 09:45:32.105217 master-0 kubenswrapper[15202]: I0319 09:45:32.105155 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-f44656786-v74wx"
Mar 19 09:45:32.514285 master-0 kubenswrapper[15202]: I0319 09:45:32.514192 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh"
Mar 19 09:45:42.181498 master-0 kubenswrapper[15202]: I0319 09:45:42.180988 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d7b76b756-hw274"
Mar 19 09:45:50.946501 master-0 kubenswrapper[15202]: I0319 09:45:50.944831 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dttqv"]
Mar 19 09:45:50.957176 master-0 kubenswrapper[15202]: I0319 09:45:50.953654 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:50.960046 master-0 kubenswrapper[15202]: I0319 09:45:50.957996 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 19 09:45:50.960046 master-0 kubenswrapper[15202]: I0319 09:45:50.959536 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 19 09:45:50.963394 master-0 kubenswrapper[15202]: I0319 09:45:50.963308 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"]
Mar 19 09:45:50.965265 master-0 kubenswrapper[15202]: I0319 09:45:50.964713 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:50.969744 master-0 kubenswrapper[15202]: I0319 09:45:50.966614 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 19 09:45:51.002906 master-0 kubenswrapper[15202]: I0319 09:45:51.002270 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"]
Mar 19 09:45:51.059851 master-0 kubenswrapper[15202]: I0319 09:45:51.059779 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.060260 master-0 kubenswrapper[15202]: I0319 09:45:51.060241 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics-certs\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.060422 master-0 kubenswrapper[15202]: I0319 09:45:51.060408 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-sockets\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.064587 master-0 kubenswrapper[15202]: I0319 09:45:51.064567 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-startup\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.064697 master-0 kubenswrapper[15202]: I0319 09:45:51.064683 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-reloader\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.064852 master-0 kubenswrapper[15202]: I0319 09:45:51.064836 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9d7\" (UniqueName: \"kubernetes.io/projected/fb00ada9-e047-47a7-82b0-44a3a66d6669-kube-api-access-5j9d7\") pod \"frr-k8s-webhook-server-bcc4b6f68-sfpc9\" (UID: \"fb00ada9-e047-47a7-82b0-44a3a66d6669\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:51.068534 master-0 kubenswrapper[15202]: I0319 09:45:51.065264 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w24cz\" (UniqueName: \"kubernetes.io/projected/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-kube-api-access-w24cz\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.068534 master-0 kubenswrapper[15202]: I0319 09:45:51.065456 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb00ada9-e047-47a7-82b0-44a3a66d6669-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-sfpc9\" (UID: \"fb00ada9-e047-47a7-82b0-44a3a66d6669\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:51.068534 master-0 kubenswrapper[15202]: I0319 09:45:51.065575 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-conf\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.107497 master-0 kubenswrapper[15202]: I0319 09:45:51.105745 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jkzd2"]
Mar 19 09:45:51.110758 master-0 kubenswrapper[15202]: I0319 09:45:51.109356 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.112920 master-0 kubenswrapper[15202]: I0319 09:45:51.112831 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 19 09:45:51.114699 master-0 kubenswrapper[15202]: I0319 09:45:51.114629 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 19 09:45:51.117309 master-0 kubenswrapper[15202]: I0319 09:45:51.114772 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 19 09:45:51.133634 master-0 kubenswrapper[15202]: I0319 09:45:51.128038 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-jkh97"]
Mar 19 09:45:51.133634 master-0 kubenswrapper[15202]: I0319 09:45:51.129990 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.133634 master-0 kubenswrapper[15202]: I0319 09:45:51.133547 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 19 09:45:51.138669 master-0 kubenswrapper[15202]: I0319 09:45:51.138614 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-jkh97"]
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168070 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-sockets\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168117 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-startup\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168139 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-reloader\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168163 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-metallb-excludel2\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168202 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9d7\" (UniqueName: \"kubernetes.io/projected/fb00ada9-e047-47a7-82b0-44a3a66d6669-kube-api-access-5j9d7\") pod \"frr-k8s-webhook-server-bcc4b6f68-sfpc9\" (UID: \"fb00ada9-e047-47a7-82b0-44a3a66d6669\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168230 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-metrics-certs\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168667 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-sockets\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168712 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w24cz\" (UniqueName: \"kubernetes.io/projected/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-kube-api-access-w24cz\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168736 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-cert\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168762 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm97g\" (UniqueName: \"kubernetes.io/projected/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-kube-api-access-zm97g\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168784 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168806 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh77x\" (UniqueName: \"kubernetes.io/projected/5cc321ea-4a7a-440d-a58c-a9d141f87363-kube-api-access-wh77x\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168829 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb00ada9-e047-47a7-82b0-44a3a66d6669-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-sfpc9\" (UID: \"fb00ada9-e047-47a7-82b0-44a3a66d6669\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168856 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-conf\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168893 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168913 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics-certs\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.168944 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-metrics-certs\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.169227 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-reloader\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.169305 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-startup\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.169897 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: I0319 09:45:51.170128 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-frr-conf\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: E0319 09:45:51.170215 15202 secret.go:189] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 19 09:45:51.170504 master-0 kubenswrapper[15202]: E0319 09:45:51.170318 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics-certs podName:b9d34e98-54a4-4e3b-ae50-92832b3dce0b nodeName:}" failed. No retries permitted until 2026-03-19 09:45:51.670299876 +0000 UTC m=+1269.055714692 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics-certs") pod "frr-k8s-dttqv" (UID: "b9d34e98-54a4-4e3b-ae50-92832b3dce0b") : secret "frr-k8s-certs-secret" not found
Mar 19 09:45:51.175440 master-0 kubenswrapper[15202]: I0319 09:45:51.175397 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb00ada9-e047-47a7-82b0-44a3a66d6669-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-sfpc9\" (UID: \"fb00ada9-e047-47a7-82b0-44a3a66d6669\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:51.187032 master-0 kubenswrapper[15202]: I0319 09:45:51.186985 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w24cz\" (UniqueName: \"kubernetes.io/projected/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-kube-api-access-w24cz\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:45:51.187894 master-0 kubenswrapper[15202]: I0319 09:45:51.187843 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9d7\" (UniqueName: \"kubernetes.io/projected/fb00ada9-e047-47a7-82b0-44a3a66d6669-kube-api-access-5j9d7\") pod \"frr-k8s-webhook-server-bcc4b6f68-sfpc9\" (UID: \"fb00ada9-e047-47a7-82b0-44a3a66d6669\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:45:51.271281 master-0 kubenswrapper[15202]: I0319 09:45:51.271122 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-metrics-certs\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.271281 master-0 kubenswrapper[15202]: I0319 09:45:51.271210 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-cert\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.271281 master-0 kubenswrapper[15202]: I0319 09:45:51.271240 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm97g\" (UniqueName: \"kubernetes.io/projected/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-kube-api-access-zm97g\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.271281 master-0 kubenswrapper[15202]: I0319 09:45:51.271274 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.271281 master-0 kubenswrapper[15202]: I0319 09:45:51.271292 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh77x\" (UniqueName: \"kubernetes.io/projected/5cc321ea-4a7a-440d-a58c-a9d141f87363-kube-api-access-wh77x\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.271727 master-0 kubenswrapper[15202]: I0319 09:45:51.271373 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-metrics-certs\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:45:51.271727 master-0 kubenswrapper[15202]: I0319 09:45:51.271430 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-metallb-excludel2\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:51.271960 master-0 kubenswrapper[15202]: E0319 09:45:51.271917 15202 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 19 09:45:51.272121 master-0 kubenswrapper[15202]: E0319 09:45:51.272105 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist podName:f2a48cff-3780-4bd2-b12c-e6b77a990d8b nodeName:}" failed. No retries permitted until 2026-03-19 09:45:51.772074062 +0000 UTC m=+1269.157489048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist") pod "speaker-jkzd2" (UID: "f2a48cff-3780-4bd2-b12c-e6b77a990d8b") : secret "metallb-memberlist" not found
Mar 19 09:45:51.273537 master-0 kubenswrapper[15202]: E0319 09:45:51.272930 15202 secret.go:189] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 19 09:45:51.273537 master-0 kubenswrapper[15202]: E0319 09:45:51.273082 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-metrics-certs podName:5cc321ea-4a7a-440d-a58c-a9d141f87363 nodeName:}" failed. No retries permitted until 2026-03-19 09:45:51.773043846 +0000 UTC m=+1269.158458682 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-metrics-certs") pod "controller-7bb4cc7c98-jkh97" (UID: "5cc321ea-4a7a-440d-a58c-a9d141f87363") : secret "controller-certs-secret" not found Mar 19 09:45:51.273537 master-0 kubenswrapper[15202]: I0319 09:45:51.273451 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-metallb-excludel2\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2" Mar 19 09:45:51.275978 master-0 kubenswrapper[15202]: I0319 09:45:51.275948 15202 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 09:45:51.281149 master-0 kubenswrapper[15202]: I0319 09:45:51.281117 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-metrics-certs\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2" Mar 19 09:45:51.290537 master-0 kubenswrapper[15202]: I0319 09:45:51.287592 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-cert\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97" Mar 19 09:45:51.291081 master-0 kubenswrapper[15202]: I0319 09:45:51.290971 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm97g\" (UniqueName: \"kubernetes.io/projected/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-kube-api-access-zm97g\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2" Mar 19 09:45:51.293684 master-0 
kubenswrapper[15202]: I0319 09:45:51.293652 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh77x\" (UniqueName: \"kubernetes.io/projected/5cc321ea-4a7a-440d-a58c-a9d141f87363-kube-api-access-wh77x\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97" Mar 19 09:45:51.312680 master-0 kubenswrapper[15202]: I0319 09:45:51.311760 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9" Mar 19 09:45:51.680408 master-0 kubenswrapper[15202]: I0319 09:45:51.680356 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics-certs\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv" Mar 19 09:45:51.684837 master-0 kubenswrapper[15202]: I0319 09:45:51.684767 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9d34e98-54a4-4e3b-ae50-92832b3dce0b-metrics-certs\") pod \"frr-k8s-dttqv\" (UID: \"b9d34e98-54a4-4e3b-ae50-92832b3dce0b\") " pod="metallb-system/frr-k8s-dttqv" Mar 19 09:45:51.782708 master-0 kubenswrapper[15202]: I0319 09:45:51.782653 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-metrics-certs\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97" Mar 19 09:45:51.783130 master-0 kubenswrapper[15202]: I0319 09:45:51.783107 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist\") pod 
\"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2" Mar 19 09:45:51.783779 master-0 kubenswrapper[15202]: E0319 09:45:51.783414 15202 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 09:45:51.783779 master-0 kubenswrapper[15202]: E0319 09:45:51.783630 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist podName:f2a48cff-3780-4bd2-b12c-e6b77a990d8b nodeName:}" failed. No retries permitted until 2026-03-19 09:45:52.783592448 +0000 UTC m=+1270.169007274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist") pod "speaker-jkzd2" (UID: "f2a48cff-3780-4bd2-b12c-e6b77a990d8b") : secret "metallb-memberlist" not found Mar 19 09:45:51.788700 master-0 kubenswrapper[15202]: I0319 09:45:51.788624 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5cc321ea-4a7a-440d-a58c-a9d141f87363-metrics-certs\") pod \"controller-7bb4cc7c98-jkh97\" (UID: \"5cc321ea-4a7a-440d-a58c-a9d141f87363\") " pod="metallb-system/controller-7bb4cc7c98-jkh97" Mar 19 09:45:51.803271 master-0 kubenswrapper[15202]: W0319 09:45:51.803226 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb00ada9_e047_47a7_82b0_44a3a66d6669.slice/crio-2ddb409a51adf77b012afab39b58211dfd0b2dd36e2d0079d0f5cc6187d2eecf WatchSource:0}: Error finding container 2ddb409a51adf77b012afab39b58211dfd0b2dd36e2d0079d0f5cc6187d2eecf: Status 404 returned error can't find the container with id 2ddb409a51adf77b012afab39b58211dfd0b2dd36e2d0079d0f5cc6187d2eecf Mar 19 09:45:51.810258 master-0 kubenswrapper[15202]: I0319 09:45:51.810230 15202 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"] Mar 19 09:45:51.822734 master-0 kubenswrapper[15202]: I0319 09:45:51.822655 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-jkh97" Mar 19 09:45:51.895126 master-0 kubenswrapper[15202]: I0319 09:45:51.895052 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dttqv" Mar 19 09:45:52.055776 master-0 kubenswrapper[15202]: I0319 09:45:52.055727 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9" event={"ID":"fb00ada9-e047-47a7-82b0-44a3a66d6669","Type":"ContainerStarted","Data":"2ddb409a51adf77b012afab39b58211dfd0b2dd36e2d0079d0f5cc6187d2eecf"} Mar 19 09:45:52.388210 master-0 kubenswrapper[15202]: I0319 09:45:52.388132 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-jkh97"] Mar 19 09:45:52.395247 master-0 kubenswrapper[15202]: W0319 09:45:52.395162 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc321ea_4a7a_440d_a58c_a9d141f87363.slice/crio-d6945382f484508812e18fd563168225e4d4c68e3d0bc1428f0959e0367d1639 WatchSource:0}: Error finding container d6945382f484508812e18fd563168225e4d4c68e3d0bc1428f0959e0367d1639: Status 404 returned error can't find the container with id d6945382f484508812e18fd563168225e4d4c68e3d0bc1428f0959e0367d1639 Mar 19 09:45:52.806780 master-0 kubenswrapper[15202]: I0319 09:45:52.806648 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2" Mar 19 09:45:52.814091 master-0 kubenswrapper[15202]: I0319 09:45:52.814007 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f2a48cff-3780-4bd2-b12c-e6b77a990d8b-memberlist\") pod \"speaker-jkzd2\" (UID: \"f2a48cff-3780-4bd2-b12c-e6b77a990d8b\") " pod="metallb-system/speaker-jkzd2" Mar 19 09:45:52.965427 master-0 kubenswrapper[15202]: I0319 09:45:52.965269 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jkzd2" Mar 19 09:45:53.084835 master-0 kubenswrapper[15202]: I0319 09:45:53.084702 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"eaf0b6522768824eaa5603e6a322cd574ce0f33ddf8636e4f1226d9e2149b7ff"} Mar 19 09:45:53.088350 master-0 kubenswrapper[15202]: I0319 09:45:53.088319 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jkzd2" event={"ID":"f2a48cff-3780-4bd2-b12c-e6b77a990d8b","Type":"ContainerStarted","Data":"5054c8e65b3ac46071eb9cb5779e9a28b2b95f7cc2e786e4f78116e3cce66d64"} Mar 19 09:45:53.092683 master-0 kubenswrapper[15202]: I0319 09:45:53.092653 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jkh97" event={"ID":"5cc321ea-4a7a-440d-a58c-a9d141f87363","Type":"ContainerStarted","Data":"64310653e04ccda9eeb8fc9a08550d55cdcfcbe214a5349674ba2e8bfc10420d"} Mar 19 09:45:53.092683 master-0 kubenswrapper[15202]: I0319 09:45:53.092681 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jkh97" event={"ID":"5cc321ea-4a7a-440d-a58c-a9d141f87363","Type":"ContainerStarted","Data":"d6945382f484508812e18fd563168225e4d4c68e3d0bc1428f0959e0367d1639"} Mar 19 09:45:54.049121 master-0 kubenswrapper[15202]: I0319 09:45:54.049055 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6"] Mar 19 09:45:54.079153 master-0 
kubenswrapper[15202]: I0319 09:45:54.079098 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" Mar 19 09:45:54.079573 master-0 kubenswrapper[15202]: I0319 09:45:54.079512 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6"] Mar 19 09:45:54.100254 master-0 kubenswrapper[15202]: I0319 09:45:54.099062 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6"] Mar 19 09:45:54.101817 master-0 kubenswrapper[15202]: I0319 09:45:54.101324 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.106334 master-0 kubenswrapper[15202]: I0319 09:45:54.106287 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 09:45:54.148358 master-0 kubenswrapper[15202]: I0319 09:45:54.148288 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jkzd2" event={"ID":"f2a48cff-3780-4bd2-b12c-e6b77a990d8b","Type":"ContainerStarted","Data":"d1acd628431e75122a7ebf3ac29cedf55b03d8d2e35049034aba18617f20c70f"} Mar 19 09:45:54.150442 master-0 kubenswrapper[15202]: I0319 09:45:54.150367 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6km4g\" (UniqueName: \"kubernetes.io/projected/4dc72b1a-a76e-4246-be58-5576544be5a8-kube-api-access-6km4g\") pod \"nmstate-webhook-5f558f5558-5wgm6\" (UID: \"4dc72b1a-a76e-4246-be58-5576544be5a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.150546 master-0 kubenswrapper[15202]: I0319 09:45:54.150480 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplq2\" (UniqueName: 
\"kubernetes.io/projected/e0518d63-cd74-47d9-8d59-bc542409fec0-kube-api-access-hplq2\") pod \"nmstate-metrics-9b8c8685d-cpgt6\" (UID: \"e0518d63-cd74-47d9-8d59-bc542409fec0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" Mar 19 09:45:54.151118 master-0 kubenswrapper[15202]: I0319 09:45:54.150602 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4dc72b1a-a76e-4246-be58-5576544be5a8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5wgm6\" (UID: \"4dc72b1a-a76e-4246-be58-5576544be5a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.153282 master-0 kubenswrapper[15202]: I0319 09:45:54.153234 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6"] Mar 19 09:45:54.153787 master-0 kubenswrapper[15202]: I0319 09:45:54.153758 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jkh97" event={"ID":"5cc321ea-4a7a-440d-a58c-a9d141f87363","Type":"ContainerStarted","Data":"cd82d985ff6062f572178748a1495cccc50841a6478e35635cdf706bb20a5669"} Mar 19 09:45:54.154912 master-0 kubenswrapper[15202]: I0319 09:45:54.154688 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-jkh97" Mar 19 09:45:54.166033 master-0 kubenswrapper[15202]: I0319 09:45:54.165967 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gns5r"] Mar 19 09:45:54.173159 master-0 kubenswrapper[15202]: I0319 09:45:54.173093 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.252351 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4dc72b1a-a76e-4246-be58-5576544be5a8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5wgm6\" (UID: \"4dc72b1a-a76e-4246-be58-5576544be5a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.255024 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-kube-api-access-mnw6l\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.255141 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-ovs-socket\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.255175 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-dbus-socket\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.255253 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6km4g\" (UniqueName: 
\"kubernetes.io/projected/4dc72b1a-a76e-4246-be58-5576544be5a8-kube-api-access-6km4g\") pod \"nmstate-webhook-5f558f5558-5wgm6\" (UID: \"4dc72b1a-a76e-4246-be58-5576544be5a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.255366 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-nmstate-lock\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.255565 master-0 kubenswrapper[15202]: I0319 09:45:54.255391 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplq2\" (UniqueName: \"kubernetes.io/projected/e0518d63-cd74-47d9-8d59-bc542409fec0-kube-api-access-hplq2\") pod \"nmstate-metrics-9b8c8685d-cpgt6\" (UID: \"e0518d63-cd74-47d9-8d59-bc542409fec0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" Mar 19 09:45:54.261867 master-0 kubenswrapper[15202]: I0319 09:45:54.261808 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4dc72b1a-a76e-4246-be58-5576544be5a8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5wgm6\" (UID: \"4dc72b1a-a76e-4246-be58-5576544be5a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.271040 master-0 kubenswrapper[15202]: I0319 09:45:54.270957 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-jkh97" podStartSLOduration=2.009665828 podStartE2EDuration="3.270936004s" podCreationTimestamp="2026-03-19 09:45:51 +0000 UTC" firstStartedPulling="2026-03-19 09:45:52.602148593 +0000 UTC m=+1269.987563439" lastFinishedPulling="2026-03-19 09:45:53.863418799 +0000 UTC m=+1271.248833615" observedRunningTime="2026-03-19 
09:45:54.236009594 +0000 UTC m=+1271.621424410" watchObservedRunningTime="2026-03-19 09:45:54.270936004 +0000 UTC m=+1271.656350820" Mar 19 09:45:54.283736 master-0 kubenswrapper[15202]: I0319 09:45:54.283701 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6km4g\" (UniqueName: \"kubernetes.io/projected/4dc72b1a-a76e-4246-be58-5576544be5a8-kube-api-access-6km4g\") pod \"nmstate-webhook-5f558f5558-5wgm6\" (UID: \"4dc72b1a-a76e-4246-be58-5576544be5a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.290319 master-0 kubenswrapper[15202]: I0319 09:45:54.290265 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplq2\" (UniqueName: \"kubernetes.io/projected/e0518d63-cd74-47d9-8d59-bc542409fec0-kube-api-access-hplq2\") pod \"nmstate-metrics-9b8c8685d-cpgt6\" (UID: \"e0518d63-cd74-47d9-8d59-bc542409fec0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" Mar 19 09:45:54.329659 master-0 kubenswrapper[15202]: I0319 09:45:54.329567 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"] Mar 19 09:45:54.330919 master-0 kubenswrapper[15202]: I0319 09:45:54.330803 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" Mar 19 09:45:54.335590 master-0 kubenswrapper[15202]: I0319 09:45:54.335527 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 09:45:54.335704 master-0 kubenswrapper[15202]: I0319 09:45:54.335608 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 09:45:54.343629 master-0 kubenswrapper[15202]: I0319 09:45:54.343070 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"] Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.358365 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-nmstate-lock\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.358445 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-kube-api-access-mnw6l\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.358534 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-ovs-socket\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.358565 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-dbus-socket\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.359087 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-dbus-socket\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.359149 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-nmstate-lock\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.360792 master-0 kubenswrapper[15202]: I0319 09:45:54.359401 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-ovs-socket\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.384904 master-0 kubenswrapper[15202]: I0319 09:45:54.384804 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnw6l\" (UniqueName: \"kubernetes.io/projected/f9e7d1ea-3a5e-460c-8b32-5687e773d19d-kube-api-access-mnw6l\") pod \"nmstate-handler-gns5r\" (UID: \"f9e7d1ea-3a5e-460c-8b32-5687e773d19d\") " pod="openshift-nmstate/nmstate-handler-gns5r" Mar 19 09:45:54.449159 master-0 kubenswrapper[15202]: I0319 09:45:54.447553 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" Mar 19 09:45:54.461311 master-0 kubenswrapper[15202]: I0319 09:45:54.461090 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" Mar 19 09:45:54.461579 master-0 kubenswrapper[15202]: I0319 09:45:54.461521 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4c02ef0-564a-4eec-8979-6e4a764bfddc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" Mar 19 09:45:54.461653 master-0 kubenswrapper[15202]: I0319 09:45:54.461625 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnc25\" (UniqueName: \"kubernetes.io/projected/a4c02ef0-564a-4eec-8979-6e4a764bfddc-kube-api-access-rnc25\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" Mar 19 09:45:54.461732 master-0 kubenswrapper[15202]: I0319 09:45:54.461697 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c02ef0-564a-4eec-8979-6e4a764bfddc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" Mar 19 09:45:54.512124 master-0 kubenswrapper[15202]: I0319 09:45:54.511537 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-gns5r"
Mar 19 09:45:54.576941 master-0 kubenswrapper[15202]: I0319 09:45:54.572494 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4c02ef0-564a-4eec-8979-6e4a764bfddc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.576941 master-0 kubenswrapper[15202]: I0319 09:45:54.572614 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnc25\" (UniqueName: \"kubernetes.io/projected/a4c02ef0-564a-4eec-8979-6e4a764bfddc-kube-api-access-rnc25\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.576941 master-0 kubenswrapper[15202]: I0319 09:45:54.572693 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c02ef0-564a-4eec-8979-6e4a764bfddc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.576941 master-0 kubenswrapper[15202]: I0319 09:45:54.574582 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fdb5b65cd-fdkqt"]
Mar 19 09:45:54.576941 master-0 kubenswrapper[15202]: I0319 09:45:54.575492 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4c02ef0-564a-4eec-8979-6e4a764bfddc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.577253 master-0 kubenswrapper[15202]: I0319 09:45:54.577097 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c02ef0-564a-4eec-8979-6e4a764bfddc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.581683 master-0 kubenswrapper[15202]: I0319 09:45:54.579587 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.582389 master-0 kubenswrapper[15202]: I0319 09:45:54.581877 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-pzxns"
Mar 19 09:45:54.611776 master-0 kubenswrapper[15202]: I0319 09:45:54.610923 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdb5b65cd-fdkqt"]
Mar 19 09:45:54.620546 master-0 kubenswrapper[15202]: I0319 09:45:54.617656 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnc25\" (UniqueName: \"kubernetes.io/projected/a4c02ef0-564a-4eec-8979-6e4a764bfddc-kube-api-access-rnc25\") pod \"nmstate-console-plugin-86f58fcf4-dlgsc\" (UID: \"a4c02ef0-564a-4eec-8979-6e4a764bfddc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674149 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-trusted-ca-bundle\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674245 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvcv\" (UniqueName: \"kubernetes.io/projected/64b2a748-e1c7-458d-9287-ab369cd3f056-kube-api-access-wrvcv\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674279 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64b2a748-e1c7-458d-9287-ab369cd3f056-console-serving-cert\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674321 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64b2a748-e1c7-458d-9287-ab369cd3f056-console-oauth-config\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674388 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-console-config\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674488 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-oauth-serving-cert\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.676437 master-0 kubenswrapper[15202]: I0319 09:45:54.674613 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-service-ca\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.693264 master-0 kubenswrapper[15202]: I0319 09:45:54.689458 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"
Mar 19 09:45:54.783563 master-0 kubenswrapper[15202]: I0319 09:45:54.782870 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-trusted-ca-bundle\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.783563 master-0 kubenswrapper[15202]: I0319 09:45:54.783098 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvcv\" (UniqueName: \"kubernetes.io/projected/64b2a748-e1c7-458d-9287-ab369cd3f056-kube-api-access-wrvcv\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.783563 master-0 kubenswrapper[15202]: I0319 09:45:54.783134 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64b2a748-e1c7-458d-9287-ab369cd3f056-console-serving-cert\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.783563 master-0 kubenswrapper[15202]: I0319 09:45:54.783289 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64b2a748-e1c7-458d-9287-ab369cd3f056-console-oauth-config\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.783885 master-0 kubenswrapper[15202]: I0319 09:45:54.783705 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-console-config\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.783885 master-0 kubenswrapper[15202]: I0319 09:45:54.783773 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-oauth-serving-cert\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.783885 master-0 kubenswrapper[15202]: I0319 09:45:54.783874 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-service-ca\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.785419 master-0 kubenswrapper[15202]: I0319 09:45:54.785383 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-service-ca\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.788729 master-0 kubenswrapper[15202]: I0319 09:45:54.787130 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-trusted-ca-bundle\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.788729 master-0 kubenswrapper[15202]: I0319 09:45:54.787925 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-console-config\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.792712 master-0 kubenswrapper[15202]: I0319 09:45:54.790133 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64b2a748-e1c7-458d-9287-ab369cd3f056-console-oauth-config\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.793331 master-0 kubenswrapper[15202]: I0319 09:45:54.793243 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64b2a748-e1c7-458d-9287-ab369cd3f056-oauth-serving-cert\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.813135 master-0 kubenswrapper[15202]: I0319 09:45:54.813082 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvcv\" (UniqueName: \"kubernetes.io/projected/64b2a748-e1c7-458d-9287-ab369cd3f056-kube-api-access-wrvcv\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.831951 master-0 kubenswrapper[15202]: I0319 09:45:54.831386 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64b2a748-e1c7-458d-9287-ab369cd3f056-console-serving-cert\") pod \"console-5fdb5b65cd-fdkqt\" (UID: \"64b2a748-e1c7-458d-9287-ab369cd3f056\") " pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:54.917006 master-0 kubenswrapper[15202]: I0319 09:45:54.916621 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:45:55.165596 master-0 kubenswrapper[15202]: I0319 09:45:55.165415 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gns5r" event={"ID":"f9e7d1ea-3a5e-460c-8b32-5687e773d19d","Type":"ContainerStarted","Data":"179df5371d598eb93e101c4cc0260ec936591b4a3da788b28c47b47bd9be9efb"}
Mar 19 09:45:55.643950 master-0 kubenswrapper[15202]: I0319 09:45:55.643892 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6"]
Mar 19 09:45:55.651620 master-0 kubenswrapper[15202]: W0319 09:45:55.651067 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b2a748_e1c7_458d_9287_ab369cd3f056.slice/crio-233da884d054c6810ab8fbd8e9a9af01a4a030c51f3fbe13d63caf1e2a85cee2 WatchSource:0}: Error finding container 233da884d054c6810ab8fbd8e9a9af01a4a030c51f3fbe13d63caf1e2a85cee2: Status 404 returned error can't find the container with id 233da884d054c6810ab8fbd8e9a9af01a4a030c51f3fbe13d63caf1e2a85cee2
Mar 19 09:45:55.654059 master-0 kubenswrapper[15202]: W0319 09:45:55.654006 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc72b1a_a76e_4246_be58_5576544be5a8.slice/crio-30451cad9f0e9407e301c4fb3964f09025b4a7d850b1e56eb7ffaf4478706c38 WatchSource:0}: Error finding container 30451cad9f0e9407e301c4fb3964f09025b4a7d850b1e56eb7ffaf4478706c38: Status 404 returned error can't find the container with id 30451cad9f0e9407e301c4fb3964f09025b4a7d850b1e56eb7ffaf4478706c38
Mar 19 09:45:55.662143 master-0 kubenswrapper[15202]: I0319 09:45:55.662087 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6"]
Mar 19 09:45:55.673401 master-0 kubenswrapper[15202]: I0319 09:45:55.673352 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdb5b65cd-fdkqt"]
Mar 19 09:45:55.688254 master-0 kubenswrapper[15202]: I0319 09:45:55.687443 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc"]
Mar 19 09:45:56.181141 master-0 kubenswrapper[15202]: I0319 09:45:56.181067 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" event={"ID":"a4c02ef0-564a-4eec-8979-6e4a764bfddc","Type":"ContainerStarted","Data":"c0a9205f3cea9e79b5777b1177d76d8dc26ad2766e4984dd94669757b00d48dc"}
Mar 19 09:45:56.184776 master-0 kubenswrapper[15202]: I0319 09:45:56.183509 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jkzd2" event={"ID":"f2a48cff-3780-4bd2-b12c-e6b77a990d8b","Type":"ContainerStarted","Data":"fbf7a87d64fb6baf973c2978a2dec12f10c3ff8ca284b1760021e9abef90790d"}
Mar 19 09:45:56.184776 master-0 kubenswrapper[15202]: I0319 09:45:56.183760 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jkzd2"
Mar 19 09:45:56.185433 master-0 kubenswrapper[15202]: I0319 09:45:56.185370 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" event={"ID":"e0518d63-cd74-47d9-8d59-bc542409fec0","Type":"ContainerStarted","Data":"2a6ce7984c3b2fef2f8370062dc271661e839a576dd71bc59c94a502574c9401"}
Mar 19 09:45:56.216631 master-0 kubenswrapper[15202]: I0319 09:45:56.216183 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdb5b65cd-fdkqt" event={"ID":"64b2a748-e1c7-458d-9287-ab369cd3f056","Type":"ContainerStarted","Data":"258ebe9055cd22b7b766b6fe830e5c6c148c4beb5f4fab80d17de7dc20978846"}
Mar 19 09:45:56.216631 master-0 kubenswrapper[15202]: I0319 09:45:56.216250 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdb5b65cd-fdkqt" event={"ID":"64b2a748-e1c7-458d-9287-ab369cd3f056","Type":"ContainerStarted","Data":"233da884d054c6810ab8fbd8e9a9af01a4a030c51f3fbe13d63caf1e2a85cee2"}
Mar 19 09:45:56.242949 master-0 kubenswrapper[15202]: I0319 09:45:56.237862 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" event={"ID":"4dc72b1a-a76e-4246-be58-5576544be5a8","Type":"ContainerStarted","Data":"30451cad9f0e9407e301c4fb3964f09025b4a7d850b1e56eb7ffaf4478706c38"}
Mar 19 09:45:56.250260 master-0 kubenswrapper[15202]: I0319 09:45:56.250133 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jkzd2" podStartSLOduration=3.821578062 podStartE2EDuration="5.250103328s" podCreationTimestamp="2026-03-19 09:45:51 +0000 UTC" firstStartedPulling="2026-03-19 09:45:53.314955024 +0000 UTC m=+1270.700369840" lastFinishedPulling="2026-03-19 09:45:54.74348029 +0000 UTC m=+1272.128895106" observedRunningTime="2026-03-19 09:45:56.231823788 +0000 UTC m=+1273.617238604" watchObservedRunningTime="2026-03-19 09:45:56.250103328 +0000 UTC m=+1273.635518144"
Mar 19 09:45:56.309756 master-0 kubenswrapper[15202]: I0319 09:45:56.307181 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fdb5b65cd-fdkqt" podStartSLOduration=2.307151842 podStartE2EDuration="2.307151842s" podCreationTimestamp="2026-03-19 09:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:45:56.276112648 +0000 UTC m=+1273.661527464" watchObservedRunningTime="2026-03-19 09:45:56.307151842 +0000 UTC m=+1273.692566658"
Mar 19 09:46:01.295835 master-0 kubenswrapper[15202]: I0319 09:46:01.294076 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9" event={"ID":"fb00ada9-e047-47a7-82b0-44a3a66d6669","Type":"ContainerStarted","Data":"4cf6276428dbd932ac8956e99fb1196b945951c286d148b274da750590b72cfa"}
Mar 19 09:46:01.295835 master-0 kubenswrapper[15202]: I0319 09:46:01.294220 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:46:01.297107 master-0 kubenswrapper[15202]: I0319 09:46:01.296766 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gns5r" event={"ID":"f9e7d1ea-3a5e-460c-8b32-5687e773d19d","Type":"ContainerStarted","Data":"b97fc40d38cde1d0c4d260b3faaedf754ae5abecf3624da0ac03c8841f9fcdf8"}
Mar 19 09:46:01.297410 master-0 kubenswrapper[15202]: I0319 09:46:01.297296 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gns5r"
Mar 19 09:46:01.310151 master-0 kubenswrapper[15202]: I0319 09:46:01.310103 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" event={"ID":"e0518d63-cd74-47d9-8d59-bc542409fec0","Type":"ContainerStarted","Data":"4b72bba82e3c7ca7cb5e491c444f2229726dc6b0cf75020d13145ad1101b2bf1"}
Mar 19 09:46:01.310350 master-0 kubenswrapper[15202]: I0319 09:46:01.310158 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" event={"ID":"e0518d63-cd74-47d9-8d59-bc542409fec0","Type":"ContainerStarted","Data":"4bd96eb20a87cca6ba938c69d4cd09d85e8978e33644e5e91c4c88928e29f48b"}
Mar 19 09:46:01.312182 master-0 kubenswrapper[15202]: I0319 09:46:01.312135 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" event={"ID":"4dc72b1a-a76e-4246-be58-5576544be5a8","Type":"ContainerStarted","Data":"fa13f61a42f3f680fb15545bcc7da4fd98eddea00450755b935ad59e97256689"}
Mar 19 09:46:01.312488 master-0 kubenswrapper[15202]: I0319 09:46:01.312350 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6"
Mar 19 09:46:01.316070 master-0 kubenswrapper[15202]: I0319 09:46:01.316019 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" event={"ID":"a4c02ef0-564a-4eec-8979-6e4a764bfddc","Type":"ContainerStarted","Data":"af3177df691236a3250c3608f50aeb81854d6f26347606e4e298bbae3b0e1988"}
Mar 19 09:46:01.326742 master-0 kubenswrapper[15202]: I0319 09:46:01.326655 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9" podStartSLOduration=2.354477473 podStartE2EDuration="11.326630078s" podCreationTimestamp="2026-03-19 09:45:50 +0000 UTC" firstStartedPulling="2026-03-19 09:45:51.806702576 +0000 UTC m=+1269.192117392" lastFinishedPulling="2026-03-19 09:46:00.778855191 +0000 UTC m=+1278.164269997" observedRunningTime="2026-03-19 09:46:01.317272219 +0000 UTC m=+1278.702687055" watchObservedRunningTime="2026-03-19 09:46:01.326630078 +0000 UTC m=+1278.712044904"
Mar 19 09:46:01.329045 master-0 kubenswrapper[15202]: I0319 09:46:01.329006 15202 generic.go:334] "Generic (PLEG): container finished" podID="b9d34e98-54a4-4e3b-ae50-92832b3dce0b" containerID="484adb7597eaab8402646fecf54f3b180047d7dc16fb9b855a2efc424954e4e6" exitCode=0
Mar 19 09:46:01.329171 master-0 kubenswrapper[15202]: I0319 09:46:01.329063 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerDied","Data":"484adb7597eaab8402646fecf54f3b180047d7dc16fb9b855a2efc424954e4e6"}
Mar 19 09:46:01.344271 master-0 kubenswrapper[15202]: I0319 09:46:01.344147 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6" podStartSLOduration=2.232927736 podStartE2EDuration="7.34411822s" podCreationTimestamp="2026-03-19 09:45:54 +0000 UTC" firstStartedPulling="2026-03-19 09:45:55.66738366 +0000 UTC m=+1273.052798476" lastFinishedPulling="2026-03-19 09:46:00.778574134 +0000 UTC m=+1278.163988960" observedRunningTime="2026-03-19 09:46:01.335518498 +0000 UTC m=+1278.720933324" watchObservedRunningTime="2026-03-19 09:46:01.34411822 +0000 UTC m=+1278.729533036"
Mar 19 09:46:01.395021 master-0 kubenswrapper[15202]: I0319 09:46:01.394681 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gns5r" podStartSLOduration=1.287466144 podStartE2EDuration="7.394660164s" podCreationTimestamp="2026-03-19 09:45:54 +0000 UTC" firstStartedPulling="2026-03-19 09:45:54.670677737 +0000 UTC m=+1272.056092543" lastFinishedPulling="2026-03-19 09:46:00.777871747 +0000 UTC m=+1278.163286563" observedRunningTime="2026-03-19 09:46:01.387943308 +0000 UTC m=+1278.773358124" watchObservedRunningTime="2026-03-19 09:46:01.394660164 +0000 UTC m=+1278.780074980"
Mar 19 09:46:01.436885 master-0 kubenswrapper[15202]: I0319 09:46:01.436429 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlgsc" podStartSLOduration=2.336677489 podStartE2EDuration="7.436406662s" podCreationTimestamp="2026-03-19 09:45:54 +0000 UTC" firstStartedPulling="2026-03-19 09:45:55.678277847 +0000 UTC m=+1273.063692663" lastFinishedPulling="2026-03-19 09:46:00.77800702 +0000 UTC m=+1278.163421836" observedRunningTime="2026-03-19 09:46:01.430386534 +0000 UTC m=+1278.815801360" watchObservedRunningTime="2026-03-19 09:46:01.436406662 +0000 UTC m=+1278.821821478"
Mar 19 09:46:01.464301 master-0 kubenswrapper[15202]: I0319 09:46:01.463827 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cpgt6" podStartSLOduration=2.34078521 podStartE2EDuration="7.463803506s" podCreationTimestamp="2026-03-19 09:45:54 +0000 UTC" firstStartedPulling="2026-03-19 09:45:55.654862631 +0000 UTC m=+1273.040277447" lastFinishedPulling="2026-03-19 09:46:00.777880927 +0000 UTC m=+1278.163295743" observedRunningTime="2026-03-19 09:46:01.451927304 +0000 UTC m=+1278.837342120" watchObservedRunningTime="2026-03-19 09:46:01.463803506 +0000 UTC m=+1278.849218322"
Mar 19 09:46:02.355617 master-0 kubenswrapper[15202]: I0319 09:46:02.355557 15202 generic.go:334] "Generic (PLEG): container finished" podID="b9d34e98-54a4-4e3b-ae50-92832b3dce0b" containerID="5c4aabefcc4ef27b092db0200a41886d03f1ccf7c4ed2987d2a182b005898ad9" exitCode=0
Mar 19 09:46:02.356429 master-0 kubenswrapper[15202]: I0319 09:46:02.355703 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerDied","Data":"5c4aabefcc4ef27b092db0200a41886d03f1ccf7c4ed2987d2a182b005898ad9"}
Mar 19 09:46:03.370315 master-0 kubenswrapper[15202]: I0319 09:46:03.370220 15202 generic.go:334] "Generic (PLEG): container finished" podID="b9d34e98-54a4-4e3b-ae50-92832b3dce0b" containerID="c6086e45ac1bf09f3f355a7576faecf3d91b96ba66cf90f8db851cefe16723b1" exitCode=0
Mar 19 09:46:03.371571 master-0 kubenswrapper[15202]: I0319 09:46:03.371495 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerDied","Data":"c6086e45ac1bf09f3f355a7576faecf3d91b96ba66cf90f8db851cefe16723b1"}
Mar 19 09:46:04.397681 master-0 kubenswrapper[15202]: I0319 09:46:04.397626 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"ef567322a60628ff80d679591593648ce32c03a112027de4e2e0fe333ed52eb3"}
Mar 19 09:46:04.397681 master-0 kubenswrapper[15202]: I0319 09:46:04.397683 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"2bd0143f08969239dd918f6ee4f07b8250941530d8470ac0a2b186e7ea3b8f78"}
Mar 19 09:46:04.398212 master-0 kubenswrapper[15202]: I0319 09:46:04.397698 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"8c2c45091490d7dc166e361b0d789cd2826ab7f4857b253f6db1c172e5a5e93e"}
Mar 19 09:46:04.398212 master-0 kubenswrapper[15202]: I0319 09:46:04.397709 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"7583a1e8408ff1d7242f7a8e3418fd2729ac731b220730ced25987e7e2db5975"}
Mar 19 09:46:04.398212 master-0 kubenswrapper[15202]: I0319 09:46:04.397719 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"0676bb4a68c795f2163b3849dd7952d47444dcf70e8504853248d731c0d94841"}
Mar 19 09:46:04.919853 master-0 kubenswrapper[15202]: I0319 09:46:04.919729 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:46:04.919853 master-0 kubenswrapper[15202]: I0319 09:46:04.919856 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:46:04.926101 master-0 kubenswrapper[15202]: I0319 09:46:04.926010 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:46:05.412035 master-0 kubenswrapper[15202]: I0319 09:46:05.411947 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dttqv" event={"ID":"b9d34e98-54a4-4e3b-ae50-92832b3dce0b","Type":"ContainerStarted","Data":"1be150c49b69ba788bb23f4aeb0aa873737a85c3c5f04340aeffb5eb7be10800"}
Mar 19 09:46:05.412968 master-0 kubenswrapper[15202]: I0319 09:46:05.412912 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:46:05.416384 master-0 kubenswrapper[15202]: I0319 09:46:05.416338 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fdb5b65cd-fdkqt"
Mar 19 09:46:05.440263 master-0 kubenswrapper[15202]: I0319 09:46:05.440182 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dttqv" podStartSLOduration=6.6638421359999995 podStartE2EDuration="15.440164808s" podCreationTimestamp="2026-03-19 09:45:50 +0000 UTC" firstStartedPulling="2026-03-19 09:45:52.050494319 +0000 UTC m=+1269.435909135" lastFinishedPulling="2026-03-19 09:46:00.826816991 +0000 UTC m=+1278.212231807" observedRunningTime="2026-03-19 09:46:05.438700162 +0000 UTC m=+1282.824114988" watchObservedRunningTime="2026-03-19 09:46:05.440164808 +0000 UTC m=+1282.825579624"
Mar 19 09:46:05.515413 master-0 kubenswrapper[15202]: I0319 09:46:05.512400 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54cf565479-phtrp"]
Mar 19 09:46:06.904053 master-0 kubenswrapper[15202]: I0319 09:46:06.903987 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:46:06.949654 master-0 kubenswrapper[15202]: I0319 09:46:06.949596 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dttqv"
Mar 19 09:46:09.556744 master-0 kubenswrapper[15202]: I0319 09:46:09.556659 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gns5r"
Mar 19 09:46:11.318060 master-0 kubenswrapper[15202]: I0319 09:46:11.317953 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-sfpc9"
Mar 19 09:46:11.831824 master-0 kubenswrapper[15202]: I0319 09:46:11.831770 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-jkh97"
Mar 19 09:46:12.969645 master-0 kubenswrapper[15202]: I0319 09:46:12.969563 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jkzd2"
Mar 19 09:46:14.471968 master-0 kubenswrapper[15202]: I0319 09:46:14.471881 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5wgm6"
Mar 19 09:46:16.514391 master-0 kubenswrapper[15202]: I0319 09:46:16.514333 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-jzfd5"]
Mar 19 09:46:16.516629 master-0 kubenswrapper[15202]: I0319 09:46:16.516572 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.520146 master-0 kubenswrapper[15202]: I0319 09:46:16.520106 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Mar 19 09:46:16.542247 master-0 kubenswrapper[15202]: I0319 09:46:16.542173 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-jzfd5"]
Mar 19 09:46:16.619425 master-0 kubenswrapper[15202]: I0319 09:46:16.619341 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/134428ef-4e6b-4f37-aec8-20fd5d2591df-metrics-cert\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.619425 master-0 kubenswrapper[15202]: I0319 09:46:16.619434 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-node-plugin-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.619774 master-0 kubenswrapper[15202]: I0319 09:46:16.619511 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-lvmd-config\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.619774 master-0 kubenswrapper[15202]: I0319 09:46:16.619741 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-device-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.619883 master-0 kubenswrapper[15202]: I0319 09:46:16.619829 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ss2c\" (UniqueName: \"kubernetes.io/projected/134428ef-4e6b-4f37-aec8-20fd5d2591df-kube-api-access-5ss2c\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.619949 master-0 kubenswrapper[15202]: I0319 09:46:16.619891 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-registration-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.620057 master-0 kubenswrapper[15202]: I0319 09:46:16.620018 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-csi-plugin-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.620143 master-0 kubenswrapper[15202]: I0319 09:46:16.620071 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-pod-volumes-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.620143 master-0 kubenswrapper[15202]: I0319 09:46:16.620095 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-sys\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.620143 master-0 kubenswrapper[15202]: I0319 09:46:16.620125 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-file-lock-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.620656 master-0 kubenswrapper[15202]: I0319 09:46:16.620177 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-run-udev\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.722762 master-0 kubenswrapper[15202]: I0319 09:46:16.722690 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-registration-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723115 master-0 kubenswrapper[15202]: I0319 09:46:16.722912 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-csi-plugin-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723156 master-0 kubenswrapper[15202]: I0319 09:46:16.723113 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-registration-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723210 master-0 kubenswrapper[15202]: I0319 09:46:16.723189 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-pod-volumes-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723250 master-0 kubenswrapper[15202]: I0319 09:46:16.723216 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-sys\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723289 master-0 kubenswrapper[15202]: I0319 09:46:16.723260 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-file-lock-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723289 master-0 kubenswrapper[15202]: I0319 09:46:16.723283 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-run-udev\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723379 master-0 kubenswrapper[15202]: I0319 09:46:16.723354 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/134428ef-4e6b-4f37-aec8-20fd5d2591df-metrics-cert\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723482 master-0 kubenswrapper[15202]: I0319 09:46:16.723438 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-node-plugin-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723732 master-0 kubenswrapper[15202]: I0319 09:46:16.723687 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-pod-volumes-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723843 master-0 kubenswrapper[15202]: I0319 09:46:16.723713 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-csi-plugin-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723915 master-0 kubenswrapper[15202]: I0319 09:46:16.723713 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-sys\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.723972 master-0 kubenswrapper[15202]: I0319 09:46:16.723869 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-node-plugin-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5"
Mar 19 09:46:16.724030 master-0 kubenswrapper[15202]: I0319 09:46:16.723987 15202 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-run-udev\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.724205 master-0 kubenswrapper[15202]: I0319 09:46:16.724085 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-file-lock-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.724205 master-0 kubenswrapper[15202]: I0319 09:46:16.724149 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-lvmd-config\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.724205 master-0 kubenswrapper[15202]: I0319 09:46:16.724009 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-lvmd-config\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.724341 master-0 kubenswrapper[15202]: I0319 09:46:16.724313 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-device-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.724416 master-0 kubenswrapper[15202]: I0319 09:46:16.724386 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ss2c\" (UniqueName: 
\"kubernetes.io/projected/134428ef-4e6b-4f37-aec8-20fd5d2591df-kube-api-access-5ss2c\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.724538 master-0 kubenswrapper[15202]: I0319 09:46:16.724517 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/134428ef-4e6b-4f37-aec8-20fd5d2591df-device-dir\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.726957 master-0 kubenswrapper[15202]: I0319 09:46:16.726919 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/134428ef-4e6b-4f37-aec8-20fd5d2591df-metrics-cert\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.857587 master-0 kubenswrapper[15202]: I0319 09:46:16.857338 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ss2c\" (UniqueName: \"kubernetes.io/projected/134428ef-4e6b-4f37-aec8-20fd5d2591df-kube-api-access-5ss2c\") pod \"vg-manager-jzfd5\" (UID: \"134428ef-4e6b-4f37-aec8-20fd5d2591df\") " pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:16.887700 master-0 kubenswrapper[15202]: I0319 09:46:16.887594 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:17.800174 master-0 kubenswrapper[15202]: W0319 09:46:17.799883 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod134428ef_4e6b_4f37_aec8_20fd5d2591df.slice/crio-7b2ad944e9b6f13d1b73f4699be420c67c54ca839f61cfdc80e0519a91c55363 WatchSource:0}: Error finding container 7b2ad944e9b6f13d1b73f4699be420c67c54ca839f61cfdc80e0519a91c55363: Status 404 returned error can't find the container with id 7b2ad944e9b6f13d1b73f4699be420c67c54ca839f61cfdc80e0519a91c55363 Mar 19 09:46:17.803300 master-0 kubenswrapper[15202]: I0319 09:46:17.802819 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-jzfd5"] Mar 19 09:46:18.565014 master-0 kubenswrapper[15202]: I0319 09:46:18.564946 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-jzfd5" event={"ID":"134428ef-4e6b-4f37-aec8-20fd5d2591df","Type":"ContainerStarted","Data":"9ac3fb5b90f19601dc2ac061ad45e8ea86b017337ed5ed2661caff2eb43db490"} Mar 19 09:46:18.565014 master-0 kubenswrapper[15202]: I0319 09:46:18.565004 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-jzfd5" event={"ID":"134428ef-4e6b-4f37-aec8-20fd5d2591df","Type":"ContainerStarted","Data":"7b2ad944e9b6f13d1b73f4699be420c67c54ca839f61cfdc80e0519a91c55363"} Mar 19 09:46:18.602369 master-0 kubenswrapper[15202]: I0319 09:46:18.602254 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-jzfd5" podStartSLOduration=2.602233833 podStartE2EDuration="2.602233833s" podCreationTimestamp="2026-03-19 09:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:18.599314511 +0000 UTC m=+1295.984729327" watchObservedRunningTime="2026-03-19 09:46:18.602233833 +0000 
UTC m=+1295.987648649" Mar 19 09:46:20.583030 master-0 kubenswrapper[15202]: I0319 09:46:20.582915 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-jzfd5_134428ef-4e6b-4f37-aec8-20fd5d2591df/vg-manager/0.log" Mar 19 09:46:20.583030 master-0 kubenswrapper[15202]: I0319 09:46:20.582976 15202 generic.go:334] "Generic (PLEG): container finished" podID="134428ef-4e6b-4f37-aec8-20fd5d2591df" containerID="9ac3fb5b90f19601dc2ac061ad45e8ea86b017337ed5ed2661caff2eb43db490" exitCode=1 Mar 19 09:46:20.583787 master-0 kubenswrapper[15202]: I0319 09:46:20.583047 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-jzfd5" event={"ID":"134428ef-4e6b-4f37-aec8-20fd5d2591df","Type":"ContainerDied","Data":"9ac3fb5b90f19601dc2ac061ad45e8ea86b017337ed5ed2661caff2eb43db490"} Mar 19 09:46:20.584910 master-0 kubenswrapper[15202]: I0319 09:46:20.584865 15202 scope.go:117] "RemoveContainer" containerID="9ac3fb5b90f19601dc2ac061ad45e8ea86b017337ed5ed2661caff2eb43db490" Mar 19 09:46:20.925070 master-0 kubenswrapper[15202]: I0319 09:46:20.924667 15202 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 19 09:46:20.962508 master-0 kubenswrapper[15202]: I0319 09:46:20.962225 15202 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-19T09:46:20.924740522Z","Handler":null,"Name":""} Mar 19 09:46:20.966322 master-0 kubenswrapper[15202]: I0319 09:46:20.966253 15202 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 19 09:46:20.966322 master-0 kubenswrapper[15202]: I0319 09:46:20.966327 15202 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: 
/var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 19 09:46:21.595626 master-0 kubenswrapper[15202]: I0319 09:46:21.595574 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-jzfd5_134428ef-4e6b-4f37-aec8-20fd5d2591df/vg-manager/0.log" Mar 19 09:46:21.596265 master-0 kubenswrapper[15202]: I0319 09:46:21.595669 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-jzfd5" event={"ID":"134428ef-4e6b-4f37-aec8-20fd5d2591df","Type":"ContainerStarted","Data":"cc8002000f66866d17df0926842df35f86867c908835ade74ba9ee094e762332"} Mar 19 09:46:21.899715 master-0 kubenswrapper[15202]: I0319 09:46:21.899571 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dttqv" Mar 19 09:46:23.632283 master-0 kubenswrapper[15202]: I0319 09:46:23.632198 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k889w"] Mar 19 09:46:23.633921 master-0 kubenswrapper[15202]: I0319 09:46:23.633887 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:23.638316 master-0 kubenswrapper[15202]: I0319 09:46:23.638270 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 09:46:23.638540 master-0 kubenswrapper[15202]: I0319 09:46:23.638519 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 09:46:23.680494 master-0 kubenswrapper[15202]: I0319 09:46:23.679074 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k889w"] Mar 19 09:46:23.807491 master-0 kubenswrapper[15202]: I0319 09:46:23.807394 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhf2l\" (UniqueName: \"kubernetes.io/projected/9716422d-5974-4db4-84b6-e7d2b1c38244-kube-api-access-xhf2l\") pod \"openstack-operator-index-k889w\" (UID: \"9716422d-5974-4db4-84b6-e7d2b1c38244\") " pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:23.908907 master-0 kubenswrapper[15202]: I0319 09:46:23.908756 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhf2l\" (UniqueName: \"kubernetes.io/projected/9716422d-5974-4db4-84b6-e7d2b1c38244-kube-api-access-xhf2l\") pod \"openstack-operator-index-k889w\" (UID: \"9716422d-5974-4db4-84b6-e7d2b1c38244\") " pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:23.924234 master-0 kubenswrapper[15202]: I0319 09:46:23.924186 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhf2l\" (UniqueName: \"kubernetes.io/projected/9716422d-5974-4db4-84b6-e7d2b1c38244-kube-api-access-xhf2l\") pod \"openstack-operator-index-k889w\" (UID: \"9716422d-5974-4db4-84b6-e7d2b1c38244\") " pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:23.952314 master-0 
kubenswrapper[15202]: I0319 09:46:23.952257 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:24.468670 master-0 kubenswrapper[15202]: I0319 09:46:24.468600 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k889w"] Mar 19 09:46:24.472037 master-0 kubenswrapper[15202]: W0319 09:46:24.472005 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9716422d_5974_4db4_84b6_e7d2b1c38244.slice/crio-a23f3881a175461fd53bd5bed33cd715f605172df8c0df98b794ca76e89bc8e5 WatchSource:0}: Error finding container a23f3881a175461fd53bd5bed33cd715f605172df8c0df98b794ca76e89bc8e5: Status 404 returned error can't find the container with id a23f3881a175461fd53bd5bed33cd715f605172df8c0df98b794ca76e89bc8e5 Mar 19 09:46:24.627110 master-0 kubenswrapper[15202]: I0319 09:46:24.627026 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k889w" event={"ID":"9716422d-5974-4db4-84b6-e7d2b1c38244","Type":"ContainerStarted","Data":"a23f3881a175461fd53bd5bed33cd715f605172df8c0df98b794ca76e89bc8e5"} Mar 19 09:46:26.655546 master-0 kubenswrapper[15202]: I0319 09:46:26.655383 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k889w" event={"ID":"9716422d-5974-4db4-84b6-e7d2b1c38244","Type":"ContainerStarted","Data":"b5649d1640cff4c9e522666b1b9038f9d758d5aa11ff897194f1ce94a6858ab7"} Mar 19 09:46:26.684281 master-0 kubenswrapper[15202]: I0319 09:46:26.684136 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k889w" podStartSLOduration=2.691010374 podStartE2EDuration="3.684109207s" podCreationTimestamp="2026-03-19 09:46:23 +0000 UTC" firstStartedPulling="2026-03-19 09:46:24.474837267 +0000 UTC 
m=+1301.860252083" lastFinishedPulling="2026-03-19 09:46:25.46793609 +0000 UTC m=+1302.853350916" observedRunningTime="2026-03-19 09:46:26.677883964 +0000 UTC m=+1304.063298780" watchObservedRunningTime="2026-03-19 09:46:26.684109207 +0000 UTC m=+1304.069524023" Mar 19 09:46:26.889091 master-0 kubenswrapper[15202]: I0319 09:46:26.888998 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:26.893296 master-0 kubenswrapper[15202]: I0319 09:46:26.893214 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:27.670875 master-0 kubenswrapper[15202]: I0319 09:46:27.670743 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:27.672305 master-0 kubenswrapper[15202]: I0319 09:46:27.672205 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-jzfd5" Mar 19 09:46:30.580008 master-0 kubenswrapper[15202]: I0319 09:46:30.579911 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-54cf565479-phtrp" podUID="00e8ef85-6a94-43c0-bc66-d23d4094eb8a" containerName="console" containerID="cri-o://b3efc8bceb0cac8b0e654e7e6b0770723ce3fb21a12b990155ed74274670b830" gracePeriod=15 Mar 19 09:46:30.714158 master-0 kubenswrapper[15202]: I0319 09:46:30.714108 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54cf565479-phtrp_00e8ef85-6a94-43c0-bc66-d23d4094eb8a/console/0.log" Mar 19 09:46:30.714304 master-0 kubenswrapper[15202]: I0319 09:46:30.714171 15202 generic.go:334] "Generic (PLEG): container finished" podID="00e8ef85-6a94-43c0-bc66-d23d4094eb8a" containerID="b3efc8bceb0cac8b0e654e7e6b0770723ce3fb21a12b990155ed74274670b830" exitCode=2 Mar 19 09:46:30.714304 master-0 kubenswrapper[15202]: I0319 09:46:30.714206 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54cf565479-phtrp" event={"ID":"00e8ef85-6a94-43c0-bc66-d23d4094eb8a","Type":"ContainerDied","Data":"b3efc8bceb0cac8b0e654e7e6b0770723ce3fb21a12b990155ed74274670b830"} Mar 19 09:46:31.116177 master-0 kubenswrapper[15202]: I0319 09:46:31.116108 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54cf565479-phtrp_00e8ef85-6a94-43c0-bc66-d23d4094eb8a/console/0.log" Mar 19 09:46:31.116389 master-0 kubenswrapper[15202]: I0319 09:46:31.116250 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:46:31.270255 master-0 kubenswrapper[15202]: I0319 09:46:31.270116 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-config\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.270255 master-0 kubenswrapper[15202]: I0319 09:46:31.270231 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-trusted-ca-bundle\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.271151 master-0 kubenswrapper[15202]: I0319 09:46:31.271092 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31.271240 master-0 kubenswrapper[15202]: I0319 09:46:31.271203 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-serving-cert\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.271308 master-0 kubenswrapper[15202]: I0319 09:46:31.271233 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-config" (OuterVolumeSpecName: "console-config") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31.271380 master-0 kubenswrapper[15202]: I0319 09:46:31.271312 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbxf\" (UniqueName: \"kubernetes.io/projected/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-kube-api-access-jcbxf\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.272555 master-0 kubenswrapper[15202]: I0319 09:46:31.272398 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-oauth-config\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.272670 master-0 kubenswrapper[15202]: I0319 09:46:31.272594 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-oauth-serving-cert\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: 
\"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.272763 master-0 kubenswrapper[15202]: I0319 09:46:31.272729 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-service-ca\") pod \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\" (UID: \"00e8ef85-6a94-43c0-bc66-d23d4094eb8a\") " Mar 19 09:46:31.273322 master-0 kubenswrapper[15202]: I0319 09:46:31.273181 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-service-ca" (OuterVolumeSpecName: "service-ca") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31.273403 master-0 kubenswrapper[15202]: I0319 09:46:31.273371 15202 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.273453 master-0 kubenswrapper[15202]: I0319 09:46:31.273397 15202 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.273588 master-0 kubenswrapper[15202]: I0319 09:46:31.273557 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31.274544 master-0 kubenswrapper[15202]: I0319 09:46:31.274503 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-kube-api-access-jcbxf" (OuterVolumeSpecName: "kube-api-access-jcbxf") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "kube-api-access-jcbxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:31.276520 master-0 kubenswrapper[15202]: I0319 09:46:31.276429 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:31.278075 master-0 kubenswrapper[15202]: I0319 09:46:31.277969 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "00e8ef85-6a94-43c0-bc66-d23d4094eb8a" (UID: "00e8ef85-6a94-43c0-bc66-d23d4094eb8a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:31.376370 master-0 kubenswrapper[15202]: I0319 09:46:31.376277 15202 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.376370 master-0 kubenswrapper[15202]: I0319 09:46:31.376336 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcbxf\" (UniqueName: \"kubernetes.io/projected/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-kube-api-access-jcbxf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.376370 master-0 kubenswrapper[15202]: I0319 09:46:31.376352 15202 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.376370 master-0 kubenswrapper[15202]: I0319 09:46:31.376378 15202 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.376789 master-0 kubenswrapper[15202]: I0319 09:46:31.376404 15202 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00e8ef85-6a94-43c0-bc66-d23d4094eb8a-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 09:46:31.725428 master-0 kubenswrapper[15202]: I0319 09:46:31.725328 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54cf565479-phtrp_00e8ef85-6a94-43c0-bc66-d23d4094eb8a/console/0.log" Mar 19 09:46:31.725428 master-0 kubenswrapper[15202]: I0319 09:46:31.725423 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54cf565479-phtrp" 
event={"ID":"00e8ef85-6a94-43c0-bc66-d23d4094eb8a","Type":"ContainerDied","Data":"c8a8ec04d0a38482ac5ca985e52552c04c68d6f99fd83ddb6fb415395b20d70c"} Mar 19 09:46:31.726338 master-0 kubenswrapper[15202]: I0319 09:46:31.725527 15202 scope.go:117] "RemoveContainer" containerID="b3efc8bceb0cac8b0e654e7e6b0770723ce3fb21a12b990155ed74274670b830" Mar 19 09:46:31.726338 master-0 kubenswrapper[15202]: I0319 09:46:31.725830 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54cf565479-phtrp" Mar 19 09:46:31.768104 master-0 kubenswrapper[15202]: I0319 09:46:31.768031 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54cf565479-phtrp"] Mar 19 09:46:31.776337 master-0 kubenswrapper[15202]: I0319 09:46:31.776270 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54cf565479-phtrp"] Mar 19 09:46:32.824711 master-0 kubenswrapper[15202]: I0319 09:46:32.824602 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e8ef85-6a94-43c0-bc66-d23d4094eb8a" path="/var/lib/kubelet/pods/00e8ef85-6a94-43c0-bc66-d23d4094eb8a/volumes" Mar 19 09:46:33.953140 master-0 kubenswrapper[15202]: I0319 09:46:33.953045 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:33.953140 master-0 kubenswrapper[15202]: I0319 09:46:33.953109 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:33.979301 master-0 kubenswrapper[15202]: I0319 09:46:33.979200 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k889w" Mar 19 09:46:34.807705 master-0 kubenswrapper[15202]: I0319 09:46:34.807626 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-k889w"
Mar 19 09:46:40.740353 master-0 kubenswrapper[15202]: I0319 09:46:40.740279 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"]
Mar 19 09:46:40.741016 master-0 kubenswrapper[15202]: E0319 09:46:40.740693 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e8ef85-6a94-43c0-bc66-d23d4094eb8a" containerName="console"
Mar 19 09:46:40.741016 master-0 kubenswrapper[15202]: I0319 09:46:40.740707 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e8ef85-6a94-43c0-bc66-d23d4094eb8a" containerName="console"
Mar 19 09:46:40.741016 master-0 kubenswrapper[15202]: I0319 09:46:40.740900 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e8ef85-6a94-43c0-bc66-d23d4094eb8a" containerName="console"
Mar 19 09:46:40.742077 master-0 kubenswrapper[15202]: I0319 09:46:40.742049 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:40.759309 master-0 kubenswrapper[15202]: I0319 09:46:40.759235 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"]
Mar 19 09:46:40.905503 master-0 kubenswrapper[15202]: I0319 09:46:40.905413 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:40.905748 master-0 kubenswrapper[15202]: I0319 09:46:40.905527 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6cd5\" (UniqueName: \"kubernetes.io/projected/763180d5-9e68-4e72-ad58-157a402e51eb-kube-api-access-v6cd5\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:40.905991 master-0 kubenswrapper[15202]: I0319 09:46:40.905938 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.007955 master-0 kubenswrapper[15202]: I0319 09:46:41.007822 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.007955 master-0 kubenswrapper[15202]: I0319 09:46:41.007908 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.008235 master-0 kubenswrapper[15202]: I0319 09:46:41.007957 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6cd5\" (UniqueName: \"kubernetes.io/projected/763180d5-9e68-4e72-ad58-157a402e51eb-kube-api-access-v6cd5\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.008408 master-0 kubenswrapper[15202]: I0319 09:46:41.008364 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.008823 master-0 kubenswrapper[15202]: I0319 09:46:41.008801 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.025284 master-0 kubenswrapper[15202]: I0319 09:46:41.025231 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6cd5\" (UniqueName: \"kubernetes.io/projected/763180d5-9e68-4e72-ad58-157a402e51eb-kube-api-access-v6cd5\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.059023 master-0 kubenswrapper[15202]: I0319 09:46:41.058953 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:41.532217 master-0 kubenswrapper[15202]: I0319 09:46:41.532138 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"]
Mar 19 09:46:41.538987 master-0 kubenswrapper[15202]: W0319 09:46:41.538929 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763180d5_9e68_4e72_ad58_157a402e51eb.slice/crio-02575caabcf54d9eb5a716b6ef70c952a07e30fab88dc6d9f68d729ec2c02a01 WatchSource:0}: Error finding container 02575caabcf54d9eb5a716b6ef70c952a07e30fab88dc6d9f68d729ec2c02a01: Status 404 returned error can't find the container with id 02575caabcf54d9eb5a716b6ef70c952a07e30fab88dc6d9f68d729ec2c02a01
Mar 19 09:46:41.834756 master-0 kubenswrapper[15202]: I0319 09:46:41.834676 15202 generic.go:334] "Generic (PLEG): container finished" podID="763180d5-9e68-4e72-ad58-157a402e51eb" containerID="8f2a34b674eadbf64417342769f0dab74e9da6a6800cd32afe551588c5a46b2f" exitCode=0
Mar 19 09:46:41.836163 master-0 kubenswrapper[15202]: I0319 09:46:41.834780 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv" event={"ID":"763180d5-9e68-4e72-ad58-157a402e51eb","Type":"ContainerDied","Data":"8f2a34b674eadbf64417342769f0dab74e9da6a6800cd32afe551588c5a46b2f"}
Mar 19 09:46:41.836163 master-0 kubenswrapper[15202]: I0319 09:46:41.834859 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv" event={"ID":"763180d5-9e68-4e72-ad58-157a402e51eb","Type":"ContainerStarted","Data":"02575caabcf54d9eb5a716b6ef70c952a07e30fab88dc6d9f68d729ec2c02a01"}
Mar 19 09:46:42.849967 master-0 kubenswrapper[15202]: I0319 09:46:42.849833 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv" event={"ID":"763180d5-9e68-4e72-ad58-157a402e51eb","Type":"ContainerDied","Data":"3f9f25faa04d59afa550a9b2ba59c4a8df5915c066ba4fefbb53b301d6e7a5ba"}
Mar 19 09:46:42.850751 master-0 kubenswrapper[15202]: I0319 09:46:42.849690 15202 generic.go:334] "Generic (PLEG): container finished" podID="763180d5-9e68-4e72-ad58-157a402e51eb" containerID="3f9f25faa04d59afa550a9b2ba59c4a8df5915c066ba4fefbb53b301d6e7a5ba" exitCode=0
Mar 19 09:46:43.869003 master-0 kubenswrapper[15202]: I0319 09:46:43.868953 15202 generic.go:334] "Generic (PLEG): container finished" podID="763180d5-9e68-4e72-ad58-157a402e51eb" containerID="c1a3554ddc9d9084e3d841a700df19dcc6269c0ec9785db3677383313e18a3a0" exitCode=0
Mar 19 09:46:43.869575 master-0 kubenswrapper[15202]: I0319 09:46:43.869009 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv" event={"ID":"763180d5-9e68-4e72-ad58-157a402e51eb","Type":"ContainerDied","Data":"c1a3554ddc9d9084e3d841a700df19dcc6269c0ec9785db3677383313e18a3a0"}
Mar 19 09:46:45.265114 master-0 kubenswrapper[15202]: I0319 09:46:45.265050 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:45.292312 master-0 kubenswrapper[15202]: I0319 09:46:45.292219 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-bundle\") pod \"763180d5-9e68-4e72-ad58-157a402e51eb\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") "
Mar 19 09:46:45.292624 master-0 kubenswrapper[15202]: I0319 09:46:45.292404 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6cd5\" (UniqueName: \"kubernetes.io/projected/763180d5-9e68-4e72-ad58-157a402e51eb-kube-api-access-v6cd5\") pod \"763180d5-9e68-4e72-ad58-157a402e51eb\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") "
Mar 19 09:46:45.292624 master-0 kubenswrapper[15202]: I0319 09:46:45.292521 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-util\") pod \"763180d5-9e68-4e72-ad58-157a402e51eb\" (UID: \"763180d5-9e68-4e72-ad58-157a402e51eb\") "
Mar 19 09:46:45.293105 master-0 kubenswrapper[15202]: I0319 09:46:45.292982 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-bundle" (OuterVolumeSpecName: "bundle") pod "763180d5-9e68-4e72-ad58-157a402e51eb" (UID: "763180d5-9e68-4e72-ad58-157a402e51eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:46:45.293983 master-0 kubenswrapper[15202]: I0319 09:46:45.293927 15202 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:46:45.303150 master-0 kubenswrapper[15202]: I0319 09:46:45.303097 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-util" (OuterVolumeSpecName: "util") pod "763180d5-9e68-4e72-ad58-157a402e51eb" (UID: "763180d5-9e68-4e72-ad58-157a402e51eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:46:45.337350 master-0 kubenswrapper[15202]: I0319 09:46:45.337268 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763180d5-9e68-4e72-ad58-157a402e51eb-kube-api-access-v6cd5" (OuterVolumeSpecName: "kube-api-access-v6cd5") pod "763180d5-9e68-4e72-ad58-157a402e51eb" (UID: "763180d5-9e68-4e72-ad58-157a402e51eb"). InnerVolumeSpecName "kube-api-access-v6cd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:46:45.396499 master-0 kubenswrapper[15202]: I0319 09:46:45.395518 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6cd5\" (UniqueName: \"kubernetes.io/projected/763180d5-9e68-4e72-ad58-157a402e51eb-kube-api-access-v6cd5\") on node \"master-0\" DevicePath \"\""
Mar 19 09:46:45.396499 master-0 kubenswrapper[15202]: I0319 09:46:45.395560 15202 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/763180d5-9e68-4e72-ad58-157a402e51eb-util\") on node \"master-0\" DevicePath \"\""
Mar 19 09:46:45.888667 master-0 kubenswrapper[15202]: I0319 09:46:45.888596 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv"
Mar 19 09:46:45.891698 master-0 kubenswrapper[15202]: I0319 09:46:45.891589 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv" event={"ID":"763180d5-9e68-4e72-ad58-157a402e51eb","Type":"ContainerDied","Data":"02575caabcf54d9eb5a716b6ef70c952a07e30fab88dc6d9f68d729ec2c02a01"}
Mar 19 09:46:45.891783 master-0 kubenswrapper[15202]: I0319 09:46:45.891728 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02575caabcf54d9eb5a716b6ef70c952a07e30fab88dc6d9f68d729ec2c02a01"
Mar 19 09:46:48.446048 master-0 kubenswrapper[15202]: I0319 09:46:48.445984 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"]
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: E0319 09:46:48.446406 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="pull"
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: I0319 09:46:48.446424 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="pull"
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: E0319 09:46:48.446442 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="extract"
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: I0319 09:46:48.446450 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="extract"
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: E0319 09:46:48.446491 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="util"
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: I0319 09:46:48.446501 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="util"
Mar 19 09:46:48.446875 master-0 kubenswrapper[15202]: I0319 09:46:48.446721 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="763180d5-9e68-4e72-ad58-157a402e51eb" containerName="extract"
Mar 19 09:46:48.447414 master-0 kubenswrapper[15202]: I0319 09:46:48.447389 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:46:48.462721 master-0 kubenswrapper[15202]: I0319 09:46:48.462679 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"]
Mar 19 09:46:48.550701 master-0 kubenswrapper[15202]: I0319 09:46:48.550623 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25m4d\" (UniqueName: \"kubernetes.io/projected/007cae1b-820f-406e-9039-c4ca50f4ff82-kube-api-access-25m4d\") pod \"openstack-operator-controller-init-b85c4d696-8qpd5\" (UID: \"007cae1b-820f-406e-9039-c4ca50f4ff82\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:46:48.653125 master-0 kubenswrapper[15202]: I0319 09:46:48.653025 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25m4d\" (UniqueName: \"kubernetes.io/projected/007cae1b-820f-406e-9039-c4ca50f4ff82-kube-api-access-25m4d\") pod \"openstack-operator-controller-init-b85c4d696-8qpd5\" (UID: \"007cae1b-820f-406e-9039-c4ca50f4ff82\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:46:48.673872 master-0 kubenswrapper[15202]: I0319 09:46:48.673807 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25m4d\" (UniqueName: \"kubernetes.io/projected/007cae1b-820f-406e-9039-c4ca50f4ff82-kube-api-access-25m4d\") pod \"openstack-operator-controller-init-b85c4d696-8qpd5\" (UID: \"007cae1b-820f-406e-9039-c4ca50f4ff82\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:46:48.769301 master-0 kubenswrapper[15202]: I0319 09:46:48.769097 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:46:49.233826 master-0 kubenswrapper[15202]: I0319 09:46:49.233703 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"]
Mar 19 09:46:49.238937 master-0 kubenswrapper[15202]: W0319 09:46:49.238862 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod007cae1b_820f_406e_9039_c4ca50f4ff82.slice/crio-47d58b49fec47181eef278bc850edfde10d2e21d9345396d0284389631a8da73 WatchSource:0}: Error finding container 47d58b49fec47181eef278bc850edfde10d2e21d9345396d0284389631a8da73: Status 404 returned error can't find the container with id 47d58b49fec47181eef278bc850edfde10d2e21d9345396d0284389631a8da73
Mar 19 09:46:49.934514 master-0 kubenswrapper[15202]: I0319 09:46:49.932785 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5" event={"ID":"007cae1b-820f-406e-9039-c4ca50f4ff82","Type":"ContainerStarted","Data":"47d58b49fec47181eef278bc850edfde10d2e21d9345396d0284389631a8da73"}
Mar 19 09:46:55.000540 master-0 kubenswrapper[15202]: I0319 09:46:55.000443 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5" event={"ID":"007cae1b-820f-406e-9039-c4ca50f4ff82","Type":"ContainerStarted","Data":"728854404a02853e0573fdbec44a2bf816c36cee7651d18ab7585864ded1bb7b"}
Mar 19 09:46:55.001078 master-0 kubenswrapper[15202]: I0319 09:46:55.000728 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:46:55.050149 master-0 kubenswrapper[15202]: I0319 09:46:55.050064 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5" podStartSLOduration=1.986077779 podStartE2EDuration="7.050040212s" podCreationTimestamp="2026-03-19 09:46:48 +0000 UTC" firstStartedPulling="2026-03-19 09:46:49.248749675 +0000 UTC m=+1326.634164511" lastFinishedPulling="2026-03-19 09:46:54.312712128 +0000 UTC m=+1331.698126944" observedRunningTime="2026-03-19 09:46:55.035516445 +0000 UTC m=+1332.420931261" watchObservedRunningTime="2026-03-19 09:46:55.050040212 +0000 UTC m=+1332.435455028"
Mar 19 09:47:08.774902 master-0 kubenswrapper[15202]: I0319 09:47:08.774793 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-8qpd5"
Mar 19 09:47:29.557841 master-0 kubenswrapper[15202]: I0319 09:47:29.557736 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"]
Mar 19 09:47:29.562788 master-0 kubenswrapper[15202]: I0319 09:47:29.559632 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"
Mar 19 09:47:29.596498 master-0 kubenswrapper[15202]: I0319 09:47:29.592229 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"]
Mar 19 09:47:29.596498 master-0 kubenswrapper[15202]: I0319 09:47:29.593546 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"
Mar 19 09:47:29.626114 master-0 kubenswrapper[15202]: I0319 09:47:29.623191 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvxn\" (UniqueName: \"kubernetes.io/projected/7a63b024-b47d-4e28-b8df-db50a3c95bed-kube-api-access-4vvxn\") pod \"barbican-operator-controller-manager-59bc569d95-j929h\" (UID: \"7a63b024-b47d-4e28-b8df-db50a3c95bed\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"
Mar 19 09:47:29.626114 master-0 kubenswrapper[15202]: I0319 09:47:29.623309 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjdw6\" (UniqueName: \"kubernetes.io/projected/1e2ae390-5000-498a-ad34-866298c5db8c-kube-api-access-mjdw6\") pod \"cinder-operator-controller-manager-8d58dc466-zvf6m\" (UID: \"1e2ae390-5000-498a-ad34-866298c5db8c\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"
Mar 19 09:47:29.631079 master-0 kubenswrapper[15202]: I0319 09:47:29.631029 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"]
Mar 19 09:47:29.660173 master-0 kubenswrapper[15202]: I0319 09:47:29.659906 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"]
Mar 19 09:47:29.685767 master-0 kubenswrapper[15202]: I0319 09:47:29.685709 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"]
Mar 19 09:47:29.687031 master-0 kubenswrapper[15202]: I0319 09:47:29.687007 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"
Mar 19 09:47:29.730752 master-0 kubenswrapper[15202]: I0319 09:47:29.727523 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvxn\" (UniqueName: \"kubernetes.io/projected/7a63b024-b47d-4e28-b8df-db50a3c95bed-kube-api-access-4vvxn\") pod \"barbican-operator-controller-manager-59bc569d95-j929h\" (UID: \"7a63b024-b47d-4e28-b8df-db50a3c95bed\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"
Mar 19 09:47:29.730752 master-0 kubenswrapper[15202]: I0319 09:47:29.727587 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47v77\" (UniqueName: \"kubernetes.io/projected/e3f266a5-d255-4be9-9205-76f31009a5a5-kube-api-access-47v77\") pod \"designate-operator-controller-manager-588d4d986b-lmp5n\" (UID: \"e3f266a5-d255-4be9-9205-76f31009a5a5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"
Mar 19 09:47:29.730752 master-0 kubenswrapper[15202]: I0319 09:47:29.727661 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjdw6\" (UniqueName: \"kubernetes.io/projected/1e2ae390-5000-498a-ad34-866298c5db8c-kube-api-access-mjdw6\") pod \"cinder-operator-controller-manager-8d58dc466-zvf6m\" (UID: \"1e2ae390-5000-498a-ad34-866298c5db8c\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"
Mar 19 09:47:29.745974 master-0 kubenswrapper[15202]: I0319 09:47:29.745913 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"]
Mar 19 09:47:29.748722 master-0 kubenswrapper[15202]: I0319 09:47:29.747108 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"
Mar 19 09:47:29.793038 master-0 kubenswrapper[15202]: I0319 09:47:29.791537 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"]
Mar 19 09:47:29.795990 master-0 kubenswrapper[15202]: I0319 09:47:29.795919 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvxn\" (UniqueName: \"kubernetes.io/projected/7a63b024-b47d-4e28-b8df-db50a3c95bed-kube-api-access-4vvxn\") pod \"barbican-operator-controller-manager-59bc569d95-j929h\" (UID: \"7a63b024-b47d-4e28-b8df-db50a3c95bed\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"
Mar 19 09:47:29.814893 master-0 kubenswrapper[15202]: I0319 09:47:29.814706 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"]
Mar 19 09:47:29.825728 master-0 kubenswrapper[15202]: I0319 09:47:29.825620 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"]
Mar 19 09:47:29.834487 master-0 kubenswrapper[15202]: I0319 09:47:29.834403 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"
Mar 19 09:47:29.840656 master-0 kubenswrapper[15202]: I0319 09:47:29.838827 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47v77\" (UniqueName: \"kubernetes.io/projected/e3f266a5-d255-4be9-9205-76f31009a5a5-kube-api-access-47v77\") pod \"designate-operator-controller-manager-588d4d986b-lmp5n\" (UID: \"e3f266a5-d255-4be9-9205-76f31009a5a5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"
Mar 19 09:47:29.840656 master-0 kubenswrapper[15202]: I0319 09:47:29.840191 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjdw6\" (UniqueName: \"kubernetes.io/projected/1e2ae390-5000-498a-ad34-866298c5db8c-kube-api-access-mjdw6\") pod \"cinder-operator-controller-manager-8d58dc466-zvf6m\" (UID: \"1e2ae390-5000-498a-ad34-866298c5db8c\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"
Mar 19 09:47:29.889552 master-0 kubenswrapper[15202]: I0319 09:47:29.889100 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"]
Mar 19 09:47:29.896246 master-0 kubenswrapper[15202]: I0319 09:47:29.896181 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47v77\" (UniqueName: \"kubernetes.io/projected/e3f266a5-d255-4be9-9205-76f31009a5a5-kube-api-access-47v77\") pod \"designate-operator-controller-manager-588d4d986b-lmp5n\" (UID: \"e3f266a5-d255-4be9-9205-76f31009a5a5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"
Mar 19 09:47:29.899599 master-0 kubenswrapper[15202]: I0319 09:47:29.899410 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"]
Mar 19 09:47:29.912908 master-0 kubenswrapper[15202]: I0319 09:47:29.906040 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"
Mar 19 09:47:29.932413 master-0 kubenswrapper[15202]: I0319 09:47:29.932322 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"
Mar 19 09:47:29.936857 master-0 kubenswrapper[15202]: I0319 09:47:29.936812 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"
Mar 19 09:47:29.945346 master-0 kubenswrapper[15202]: I0319 09:47:29.944497 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xff\" (UniqueName: \"kubernetes.io/projected/b2c05fc4-1191-4f97-bd50-fa0decbafbc5-kube-api-access-h6xff\") pod \"heat-operator-controller-manager-67dd5f86f5-ft2mk\" (UID: \"b2c05fc4-1191-4f97-bd50-fa0decbafbc5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"
Mar 19 09:47:29.945346 master-0 kubenswrapper[15202]: I0319 09:47:29.944851 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwg7d\" (UniqueName: \"kubernetes.io/projected/d5b615b7-8989-4ccb-afa3-85f638a5f8f0-kube-api-access-pwg7d\") pod \"glance-operator-controller-manager-79df6bcc97-sq7cg\" (UID: \"d5b615b7-8989-4ccb-afa3-85f638a5f8f0\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"
Mar 19 09:47:29.988979 master-0 kubenswrapper[15202]: I0319 09:47:29.988906 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"]
Mar 19 09:47:29.990307 master-0 kubenswrapper[15202]: I0319 09:47:29.990047 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:29.998121 master-0 kubenswrapper[15202]: I0319 09:47:29.998069 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 19 09:47:30.020424 master-0 kubenswrapper[15202]: I0319 09:47:30.020360 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"]
Mar 19 09:47:30.037567 master-0 kubenswrapper[15202]: I0319 09:47:30.037503 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"]
Mar 19 09:47:30.052600 master-0 kubenswrapper[15202]: I0319 09:47:30.048716 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwg7d\" (UniqueName: \"kubernetes.io/projected/d5b615b7-8989-4ccb-afa3-85f638a5f8f0-kube-api-access-pwg7d\") pod \"glance-operator-controller-manager-79df6bcc97-sq7cg\" (UID: \"d5b615b7-8989-4ccb-afa3-85f638a5f8f0\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"
Mar 19 09:47:30.052600 master-0 kubenswrapper[15202]: I0319 09:47:30.048821 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thlmp\" (UniqueName: \"kubernetes.io/projected/4251dd77-2fed-4b65-a847-6f1dfa2ac07b-kube-api-access-thlmp\") pod \"horizon-operator-controller-manager-8464cc45fb-b8s4c\" (UID: \"4251dd77-2fed-4b65-a847-6f1dfa2ac07b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"
Mar 19 09:47:30.052600 master-0 kubenswrapper[15202]: I0319 09:47:30.051714 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"
Mar 19 09:47:30.054894 master-0 kubenswrapper[15202]: I0319 09:47:30.054842 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx"]
Mar 19 09:47:30.055637 master-0 kubenswrapper[15202]: I0319 09:47:30.055591 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xff\" (UniqueName: \"kubernetes.io/projected/b2c05fc4-1191-4f97-bd50-fa0decbafbc5-kube-api-access-h6xff\") pod \"heat-operator-controller-manager-67dd5f86f5-ft2mk\" (UID: \"b2c05fc4-1191-4f97-bd50-fa0decbafbc5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"
Mar 19 09:47:30.056151 master-0 kubenswrapper[15202]: I0319 09:47:30.056113 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:30.057447 master-0 kubenswrapper[15202]: I0319 09:47:30.057389 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4g6d\" (UniqueName: \"kubernetes.io/projected/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-kube-api-access-p4g6d\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:30.057534 master-0 kubenswrapper[15202]: I0319 09:47:30.057495 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx"
Mar 19 09:47:30.066262 master-0 kubenswrapper[15202]: I0319 09:47:30.066085 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb"]
Mar 19 09:47:30.071637 master-0 kubenswrapper[15202]: I0319 09:47:30.069025 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb"
Mar 19 09:47:30.129212 master-0 kubenswrapper[15202]: I0319 09:47:30.112372 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwg7d\" (UniqueName: \"kubernetes.io/projected/d5b615b7-8989-4ccb-afa3-85f638a5f8f0-kube-api-access-pwg7d\") pod \"glance-operator-controller-manager-79df6bcc97-sq7cg\" (UID: \"d5b615b7-8989-4ccb-afa3-85f638a5f8f0\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"
Mar 19 09:47:30.129509 master-0 kubenswrapper[15202]: I0319 09:47:30.129376 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xff\" (UniqueName: \"kubernetes.io/projected/b2c05fc4-1191-4f97-bd50-fa0decbafbc5-kube-api-access-h6xff\") pod \"heat-operator-controller-manager-67dd5f86f5-ft2mk\" (UID: \"b2c05fc4-1191-4f97-bd50-fa0decbafbc5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"
Mar 19 09:47:30.163885 master-0 kubenswrapper[15202]: I0319 09:47:30.162389 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thlmp\" (UniqueName: \"kubernetes.io/projected/4251dd77-2fed-4b65-a847-6f1dfa2ac07b-kube-api-access-thlmp\") pod \"horizon-operator-controller-manager-8464cc45fb-b8s4c\" (UID: \"4251dd77-2fed-4b65-a847-6f1dfa2ac07b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"
Mar 19 09:47:30.163885 master-0 kubenswrapper[15202]: I0319 09:47:30.162478 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:30.163885 master-0 kubenswrapper[15202]: I0319 09:47:30.162552 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4g6d\" (UniqueName: \"kubernetes.io/projected/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-kube-api-access-p4g6d\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:30.163885 master-0 kubenswrapper[15202]: I0319 09:47:30.162586 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqbw\" (UniqueName: \"kubernetes.io/projected/cbd311bd-f196-4c7c-b694-ad5cd3c64507-kube-api-access-9xqbw\") pod \"ironic-operator-controller-manager-6f787dddc9-qlfpx\" (UID: \"cbd311bd-f196-4c7c-b694-ad5cd3c64507\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx"
Mar 19 09:47:30.163885 master-0 kubenswrapper[15202]: I0319 09:47:30.162644 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxr8f\" (UniqueName: \"kubernetes.io/projected/da0ad24d-f0dc-43fb-939e-0fa4c84473f4-kube-api-access-xxr8f\") pod \"keystone-operator-controller-manager-768b96df4c-kh9rb\" (UID: \"da0ad24d-f0dc-43fb-939e-0fa4c84473f4\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb"
Mar 19 09:47:30.172585 master-0 kubenswrapper[15202]: E0319 09:47:30.163018 15202 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 19 09:47:30.172585 master-0 kubenswrapper[15202]: E0319 09:47:30.171956 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert podName:4e3edf6d-9015-4d51-96e3-3c6bd898c4fe nodeName:}" failed. No retries permitted until 2026-03-19 09:47:30.671925633 +0000 UTC m=+1368.057340449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert") pod "infra-operator-controller-manager-7dd6bb94c9-xmlj9" (UID: "4e3edf6d-9015-4d51-96e3-3c6bd898c4fe") : secret "infra-operator-webhook-server-cert" not found
Mar 19 09:47:30.174074 master-0 kubenswrapper[15202]: I0319 09:47:30.163622 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"
Mar 19 09:47:30.185834 master-0 kubenswrapper[15202]: I0319 09:47:30.185320 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thlmp\" (UniqueName: \"kubernetes.io/projected/4251dd77-2fed-4b65-a847-6f1dfa2ac07b-kube-api-access-thlmp\") pod \"horizon-operator-controller-manager-8464cc45fb-b8s4c\" (UID: \"4251dd77-2fed-4b65-a847-6f1dfa2ac07b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"
Mar 19 09:47:30.194075 master-0 kubenswrapper[15202]: I0319 09:47:30.193974 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4g6d\" (UniqueName: \"kubernetes.io/projected/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-kube-api-access-p4g6d\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:30.200446 master-0 kubenswrapper[15202]:
I0319 09:47:30.200317 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk" Mar 19 09:47:30.217782 master-0 kubenswrapper[15202]: I0319 09:47:30.217700 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx"] Mar 19 09:47:30.240504 master-0 kubenswrapper[15202]: I0319 09:47:30.240429 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9"] Mar 19 09:47:30.241938 master-0 kubenswrapper[15202]: I0319 09:47:30.241911 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:47:30.258608 master-0 kubenswrapper[15202]: I0319 09:47:30.258518 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb"] Mar 19 09:47:30.271447 master-0 kubenswrapper[15202]: I0319 09:47:30.271226 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9"] Mar 19 09:47:30.276992 master-0 kubenswrapper[15202]: I0319 09:47:30.276345 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqbw\" (UniqueName: \"kubernetes.io/projected/cbd311bd-f196-4c7c-b694-ad5cd3c64507-kube-api-access-9xqbw\") pod \"ironic-operator-controller-manager-6f787dddc9-qlfpx\" (UID: \"cbd311bd-f196-4c7c-b694-ad5cd3c64507\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" Mar 19 09:47:30.276992 master-0 kubenswrapper[15202]: I0319 09:47:30.276419 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxr8f\" (UniqueName: 
\"kubernetes.io/projected/da0ad24d-f0dc-43fb-939e-0fa4c84473f4-kube-api-access-xxr8f\") pod \"keystone-operator-controller-manager-768b96df4c-kh9rb\" (UID: \"da0ad24d-f0dc-43fb-939e-0fa4c84473f4\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" Mar 19 09:47:30.290500 master-0 kubenswrapper[15202]: I0319 09:47:30.281715 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c" Mar 19 09:47:30.325274 master-0 kubenswrapper[15202]: I0319 09:47:30.319746 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxr8f\" (UniqueName: \"kubernetes.io/projected/da0ad24d-f0dc-43fb-939e-0fa4c84473f4-kube-api-access-xxr8f\") pod \"keystone-operator-controller-manager-768b96df4c-kh9rb\" (UID: \"da0ad24d-f0dc-43fb-939e-0fa4c84473f4\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" Mar 19 09:47:30.333952 master-0 kubenswrapper[15202]: I0319 09:47:30.327867 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqbw\" (UniqueName: \"kubernetes.io/projected/cbd311bd-f196-4c7c-b694-ad5cd3c64507-kube-api-access-9xqbw\") pod \"ironic-operator-controller-manager-6f787dddc9-qlfpx\" (UID: \"cbd311bd-f196-4c7c-b694-ad5cd3c64507\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" Mar 19 09:47:30.397047 master-0 kubenswrapper[15202]: I0319 09:47:30.378854 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dskx\" (UniqueName: \"kubernetes.io/projected/2fefcbf8-dda4-48d9-b725-4dec37f3bfd9-kube-api-access-2dskx\") pod \"manila-operator-controller-manager-55f864c847-6n7n9\" (UID: \"2fefcbf8-dda4-48d9-b725-4dec37f3bfd9\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:47:30.501446 master-0 kubenswrapper[15202]: I0319 
09:47:30.501356 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr"] Mar 19 09:47:30.502097 master-0 kubenswrapper[15202]: I0319 09:47:30.502037 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dskx\" (UniqueName: \"kubernetes.io/projected/2fefcbf8-dda4-48d9-b725-4dec37f3bfd9-kube-api-access-2dskx\") pod \"manila-operator-controller-manager-55f864c847-6n7n9\" (UID: \"2fefcbf8-dda4-48d9-b725-4dec37f3bfd9\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:47:30.542594 master-0 kubenswrapper[15202]: I0319 09:47:30.503522 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" Mar 19 09:47:30.543209 master-0 kubenswrapper[15202]: I0319 09:47:30.529751 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" Mar 19 09:47:30.553464 master-0 kubenswrapper[15202]: I0319 09:47:30.551526 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-r78pl"] Mar 19 09:47:30.553464 master-0 kubenswrapper[15202]: I0319 09:47:30.551660 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" Mar 19 09:47:30.558674 master-0 kubenswrapper[15202]: I0319 09:47:30.558642 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dskx\" (UniqueName: \"kubernetes.io/projected/2fefcbf8-dda4-48d9-b725-4dec37f3bfd9-kube-api-access-2dskx\") pod \"manila-operator-controller-manager-55f864c847-6n7n9\" (UID: \"2fefcbf8-dda4-48d9-b725-4dec37f3bfd9\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:47:30.560164 master-0 kubenswrapper[15202]: I0319 09:47:30.560144 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" Mar 19 09:47:30.606950 master-0 kubenswrapper[15202]: I0319 09:47:30.606524 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr"] Mar 19 09:47:30.633969 master-0 kubenswrapper[15202]: I0319 09:47:30.631867 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:47:30.650609 master-0 kubenswrapper[15202]: I0319 09:47:30.648543 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-r78pl"] Mar 19 09:47:30.662769 master-0 kubenswrapper[15202]: I0319 09:47:30.662337 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk"] Mar 19 09:47:30.664610 master-0 kubenswrapper[15202]: I0319 09:47:30.663954 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:47:30.679958 master-0 kubenswrapper[15202]: I0319 09:47:30.679883 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h"] Mar 19 09:47:30.708075 master-0 kubenswrapper[15202]: I0319 09:47:30.707881 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" Mar 19 09:47:30.708194 master-0 kubenswrapper[15202]: I0319 09:47:30.708085 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp76s\" (UniqueName: \"kubernetes.io/projected/9af14adc-2a84-4aa0-88a2-2ccc5de52d8a-kube-api-access-kp76s\") pod \"mariadb-operator-controller-manager-67ccfc9778-s5trr\" (UID: \"9af14adc-2a84-4aa0-88a2-2ccc5de52d8a\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" Mar 19 09:47:30.708194 master-0 kubenswrapper[15202]: I0319 09:47:30.708122 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xz2z\" (UniqueName: \"kubernetes.io/projected/2986b2af-8536-4532-a416-d1092b24cad2-kube-api-access-9xz2z\") pod \"neutron-operator-controller-manager-767865f676-r78pl\" (UID: \"2986b2af-8536-4532-a416-d1092b24cad2\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" Mar 19 09:47:30.708307 master-0 kubenswrapper[15202]: E0319 09:47:30.708273 15202 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:47:30.708351 master-0 
kubenswrapper[15202]: E0319 09:47:30.708330 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert podName:4e3edf6d-9015-4d51-96e3-3c6bd898c4fe nodeName:}" failed. No retries permitted until 2026-03-19 09:47:31.708314901 +0000 UTC m=+1369.093729717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert") pod "infra-operator-controller-manager-7dd6bb94c9-xmlj9" (UID: "4e3edf6d-9015-4d51-96e3-3c6bd898c4fe") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:47:30.717518 master-0 kubenswrapper[15202]: I0319 09:47:30.715580 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" Mar 19 09:47:30.725870 master-0 kubenswrapper[15202]: I0319 09:47:30.725819 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk"] Mar 19 09:47:30.733981 master-0 kubenswrapper[15202]: I0319 09:47:30.733756 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h"] Mar 19 09:47:30.751745 master-0 kubenswrapper[15202]: I0319 09:47:30.751705 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b"] Mar 19 09:47:30.753018 master-0 kubenswrapper[15202]: I0319 09:47:30.752994 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" Mar 19 09:47:30.772799 master-0 kubenswrapper[15202]: I0319 09:47:30.772733 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7"] Mar 19 09:47:30.781285 master-0 kubenswrapper[15202]: I0319 09:47:30.781242 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:30.788901 master-0 kubenswrapper[15202]: I0319 09:47:30.788849 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 09:47:30.797482 master-0 kubenswrapper[15202]: W0319 09:47:30.797413 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2ae390_5000_498a_ad34_866298c5db8c.slice/crio-a43a7dbabea207b978f81a51788c9094f1c05716849ad4abbac31d7f6ed6953f WatchSource:0}: Error finding container a43a7dbabea207b978f81a51788c9094f1c05716849ad4abbac31d7f6ed6953f: Status 404 returned error can't find the container with id a43a7dbabea207b978f81a51788c9094f1c05716849ad4abbac31d7f6ed6953f Mar 19 09:47:30.810446 master-0 kubenswrapper[15202]: I0319 09:47:30.810398 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84h2\" (UniqueName: \"kubernetes.io/projected/85a8f0de-e96a-4be5-8980-36532e0fa45c-kube-api-access-d84h2\") pod \"ovn-operator-controller-manager-884679f54-7fq2b\" (UID: \"85a8f0de-e96a-4be5-8980-36532e0fa45c\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" Mar 19 09:47:30.810548 master-0 kubenswrapper[15202]: I0319 09:47:30.810499 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-46jmf\" (UniqueName: \"kubernetes.io/projected/a74e399f-b504-430a-a7a7-1aa15487157f-kube-api-access-46jmf\") pod \"octavia-operator-controller-manager-5b9f45d989-jv72h\" (UID: \"a74e399f-b504-430a-a7a7-1aa15487157f\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" Mar 19 09:47:30.811906 master-0 kubenswrapper[15202]: I0319 09:47:30.811875 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvh2t\" (UniqueName: \"kubernetes.io/projected/f48bfc98-b4af-4ebe-b96f-ac119a0887d4-kube-api-access-pvh2t\") pod \"nova-operator-controller-manager-5d488d59fb-pw2xk\" (UID: \"f48bfc98-b4af-4ebe-b96f-ac119a0887d4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:47:30.811960 master-0 kubenswrapper[15202]: I0319 09:47:30.811913 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp76s\" (UniqueName: \"kubernetes.io/projected/9af14adc-2a84-4aa0-88a2-2ccc5de52d8a-kube-api-access-kp76s\") pod \"mariadb-operator-controller-manager-67ccfc9778-s5trr\" (UID: \"9af14adc-2a84-4aa0-88a2-2ccc5de52d8a\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" Mar 19 09:47:30.811960 master-0 kubenswrapper[15202]: I0319 09:47:30.811940 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xz2z\" (UniqueName: \"kubernetes.io/projected/2986b2af-8536-4532-a416-d1092b24cad2-kube-api-access-9xz2z\") pod \"neutron-operator-controller-manager-767865f676-r78pl\" (UID: \"2986b2af-8536-4532-a416-d1092b24cad2\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" Mar 19 09:47:30.812648 master-0 kubenswrapper[15202]: I0319 09:47:30.812455 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b"] Mar 19 09:47:30.842652 master-0 
kubenswrapper[15202]: I0319 09:47:30.842554 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp76s\" (UniqueName: \"kubernetes.io/projected/9af14adc-2a84-4aa0-88a2-2ccc5de52d8a-kube-api-access-kp76s\") pod \"mariadb-operator-controller-manager-67ccfc9778-s5trr\" (UID: \"9af14adc-2a84-4aa0-88a2-2ccc5de52d8a\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" Mar 19 09:47:30.847814 master-0 kubenswrapper[15202]: I0319 09:47:30.847772 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xz2z\" (UniqueName: \"kubernetes.io/projected/2986b2af-8536-4532-a416-d1092b24cad2-kube-api-access-9xz2z\") pod \"neutron-operator-controller-manager-767865f676-r78pl\" (UID: \"2986b2af-8536-4532-a416-d1092b24cad2\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" Mar 19 09:47:30.897281 master-0 kubenswrapper[15202]: I0319 09:47:30.894351 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7"] Mar 19 09:47:30.897281 master-0 kubenswrapper[15202]: I0319 09:47:30.894575 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx"] Mar 19 09:47:30.897281 master-0 kubenswrapper[15202]: I0319 09:47:30.895707 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" Mar 19 09:47:30.936303 master-0 kubenswrapper[15202]: I0319 09:47:30.936260 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvh2t\" (UniqueName: \"kubernetes.io/projected/f48bfc98-b4af-4ebe-b96f-ac119a0887d4-kube-api-access-pvh2t\") pod \"nova-operator-controller-manager-5d488d59fb-pw2xk\" (UID: \"f48bfc98-b4af-4ebe-b96f-ac119a0887d4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:47:30.936529 master-0 kubenswrapper[15202]: I0319 09:47:30.936506 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtk2v\" (UniqueName: \"kubernetes.io/projected/aeb8435e-9818-471c-b779-9e40d7084842-kube-api-access-qtk2v\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:30.936909 master-0 kubenswrapper[15202]: I0319 09:47:30.936859 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d84h2\" (UniqueName: \"kubernetes.io/projected/85a8f0de-e96a-4be5-8980-36532e0fa45c-kube-api-access-d84h2\") pod \"ovn-operator-controller-manager-884679f54-7fq2b\" (UID: \"85a8f0de-e96a-4be5-8980-36532e0fa45c\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" Mar 19 09:47:30.937021 master-0 kubenswrapper[15202]: I0319 09:47:30.936999 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:30.937167 master-0 kubenswrapper[15202]: I0319 09:47:30.937083 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jmf\" (UniqueName: \"kubernetes.io/projected/a74e399f-b504-430a-a7a7-1aa15487157f-kube-api-access-46jmf\") pod \"octavia-operator-controller-manager-5b9f45d989-jv72h\" (UID: \"a74e399f-b504-430a-a7a7-1aa15487157f\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" Mar 19 09:47:30.952047 master-0 kubenswrapper[15202]: I0319 09:47:30.950940 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx"] Mar 19 09:47:30.955776 master-0 kubenswrapper[15202]: I0319 09:47:30.955200 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvh2t\" (UniqueName: \"kubernetes.io/projected/f48bfc98-b4af-4ebe-b96f-ac119a0887d4-kube-api-access-pvh2t\") pod \"nova-operator-controller-manager-5d488d59fb-pw2xk\" (UID: \"f48bfc98-b4af-4ebe-b96f-ac119a0887d4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:47:30.960262 master-0 kubenswrapper[15202]: I0319 09:47:30.960212 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jmf\" (UniqueName: \"kubernetes.io/projected/a74e399f-b504-430a-a7a7-1aa15487157f-kube-api-access-46jmf\") pod \"octavia-operator-controller-manager-5b9f45d989-jv72h\" (UID: \"a74e399f-b504-430a-a7a7-1aa15487157f\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" Mar 19 09:47:30.972733 master-0 kubenswrapper[15202]: I0319 09:47:30.963883 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d84h2\" (UniqueName: \"kubernetes.io/projected/85a8f0de-e96a-4be5-8980-36532e0fa45c-kube-api-access-d84h2\") pod 
\"ovn-operator-controller-manager-884679f54-7fq2b\" (UID: \"85a8f0de-e96a-4be5-8980-36532e0fa45c\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" Mar 19 09:47:30.975900 master-0 kubenswrapper[15202]: I0319 09:47:30.975836 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz"] Mar 19 09:47:30.977192 master-0 kubenswrapper[15202]: I0319 09:47:30.977166 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" Mar 19 09:47:30.984240 master-0 kubenswrapper[15202]: I0319 09:47:30.984182 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-65d6b"] Mar 19 09:47:30.993526 master-0 kubenswrapper[15202]: I0319 09:47:30.993393 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" Mar 19 09:47:31.001736 master-0 kubenswrapper[15202]: I0319 09:47:31.001536 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz"] Mar 19 09:47:31.007434 master-0 kubenswrapper[15202]: I0319 09:47:31.007367 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" Mar 19 09:47:31.016060 master-0 kubenswrapper[15202]: I0319 09:47:31.015956 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-65d6b"] Mar 19 09:47:31.034050 master-0 kubenswrapper[15202]: I0319 09:47:31.033212 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj"] Mar 19 09:47:31.034331 master-0 kubenswrapper[15202]: I0319 09:47:31.034311 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" Mar 19 09:47:31.039020 master-0 kubenswrapper[15202]: I0319 09:47:31.038895 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtk2v\" (UniqueName: \"kubernetes.io/projected/aeb8435e-9818-471c-b779-9e40d7084842-kube-api-access-qtk2v\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:31.039020 master-0 kubenswrapper[15202]: I0319 09:47:31.038959 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gpdl\" (UniqueName: \"kubernetes.io/projected/3b73f526-cd94-4119-bcc1-8aa00e58b6ce-kube-api-access-6gpdl\") pod \"placement-operator-controller-manager-5784578c99-4tjlx\" (UID: \"3b73f526-cd94-4119-bcc1-8aa00e58b6ce\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" Mar 19 09:47:31.039020 master-0 kubenswrapper[15202]: I0319 09:47:31.039007 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod 
\"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:31.039367 master-0 kubenswrapper[15202]: E0319 09:47:31.039122 15202 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:47:31.039367 master-0 kubenswrapper[15202]: E0319 09:47:31.039174 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert podName:aeb8435e-9818-471c-b779-9e40d7084842 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:31.539157407 +0000 UTC m=+1368.924572223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899dzhg7" (UID: "aeb8435e-9818-471c-b779-9e40d7084842") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:47:31.045798 master-0 kubenswrapper[15202]: I0319 09:47:31.041961 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" Mar 19 09:47:31.046261 master-0 kubenswrapper[15202]: I0319 09:47:31.046212 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj"] Mar 19 09:47:31.062699 master-0 kubenswrapper[15202]: I0319 09:47:31.061991 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtk2v\" (UniqueName: \"kubernetes.io/projected/aeb8435e-9818-471c-b779-9e40d7084842-kube-api-access-qtk2v\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:31.063501 master-0 kubenswrapper[15202]: I0319 09:47:31.063432 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv"] Mar 19 09:47:31.064831 master-0 kubenswrapper[15202]: I0319 09:47:31.064802 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" Mar 19 09:47:31.087740 master-0 kubenswrapper[15202]: I0319 09:47:31.087563 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv"] Mar 19 09:47:31.091448 master-0 kubenswrapper[15202]: I0319 09:47:31.091413 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:47:31.120922 master-0 kubenswrapper[15202]: I0319 09:47:31.120876 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"] Mar 19 09:47:31.122391 master-0 kubenswrapper[15202]: I0319 09:47:31.122360 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.124958 master-0 kubenswrapper[15202]: I0319 09:47:31.124769 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 09:47:31.125940 master-0 kubenswrapper[15202]: I0319 09:47:31.125901 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 09:47:31.138444 master-0 kubenswrapper[15202]: I0319 09:47:31.138378 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"] Mar 19 09:47:31.140685 master-0 kubenswrapper[15202]: I0319 09:47:31.140635 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq74r\" (UniqueName: \"kubernetes.io/projected/3f2734d7-dd75-4742-aa62-2adf7d60700e-kube-api-access-lq74r\") pod \"test-operator-controller-manager-5c5cb9c4d7-5znsj\" (UID: \"3f2734d7-dd75-4742-aa62-2adf7d60700e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" Mar 19 09:47:31.140778 master-0 kubenswrapper[15202]: I0319 09:47:31.140746 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5gwr\" (UniqueName: \"kubernetes.io/projected/9ca4d989-df3a-4b66-a18f-21dffbc966c8-kube-api-access-r5gwr\") pod \"telemetry-operator-controller-manager-d6b694c5-j5ggz\" (UID: 
\"9ca4d989-df3a-4b66-a18f-21dffbc966c8\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" Mar 19 09:47:31.141040 master-0 kubenswrapper[15202]: I0319 09:47:31.141007 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jvm\" (UniqueName: \"kubernetes.io/projected/4bf9e14b-a962-4ea4-be38-0186a20a5da5-kube-api-access-n5jvm\") pod \"swift-operator-controller-manager-c674c5965-65d6b\" (UID: \"4bf9e14b-a962-4ea4-be38-0186a20a5da5\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" Mar 19 09:47:31.141109 master-0 kubenswrapper[15202]: I0319 09:47:31.141077 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gpdl\" (UniqueName: \"kubernetes.io/projected/3b73f526-cd94-4119-bcc1-8aa00e58b6ce-kube-api-access-6gpdl\") pod \"placement-operator-controller-manager-5784578c99-4tjlx\" (UID: \"3b73f526-cd94-4119-bcc1-8aa00e58b6ce\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" Mar 19 09:47:31.179556 master-0 kubenswrapper[15202]: I0319 09:47:31.150390 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4"] Mar 19 09:47:31.179556 master-0 kubenswrapper[15202]: I0319 09:47:31.160376 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" Mar 19 09:47:31.179556 master-0 kubenswrapper[15202]: I0319 09:47:31.173233 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gpdl\" (UniqueName: \"kubernetes.io/projected/3b73f526-cd94-4119-bcc1-8aa00e58b6ce-kube-api-access-6gpdl\") pod \"placement-operator-controller-manager-5784578c99-4tjlx\" (UID: \"3b73f526-cd94-4119-bcc1-8aa00e58b6ce\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" Mar 19 09:47:31.182903 master-0 kubenswrapper[15202]: I0319 09:47:31.181879 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4"] Mar 19 09:47:31.262215 master-0 kubenswrapper[15202]: I0319 09:47:31.261148 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" Mar 19 09:47:31.279504 master-0 kubenswrapper[15202]: I0319 09:47:31.277431 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" Mar 19 09:47:31.285622 master-0 kubenswrapper[15202]: I0319 09:47:31.285531 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.307533 master-0 kubenswrapper[15202]: I0319 09:47:31.307385 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq74r\" (UniqueName: \"kubernetes.io/projected/3f2734d7-dd75-4742-aa62-2adf7d60700e-kube-api-access-lq74r\") pod \"test-operator-controller-manager-5c5cb9c4d7-5znsj\" (UID: \"3f2734d7-dd75-4742-aa62-2adf7d60700e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" Mar 19 09:47:31.307762 master-0 kubenswrapper[15202]: I0319 09:47:31.307624 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5gwr\" (UniqueName: \"kubernetes.io/projected/9ca4d989-df3a-4b66-a18f-21dffbc966c8-kube-api-access-r5gwr\") pod \"telemetry-operator-controller-manager-d6b694c5-j5ggz\" (UID: \"9ca4d989-df3a-4b66-a18f-21dffbc966c8\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" Mar 19 09:47:31.307762 master-0 kubenswrapper[15202]: I0319 09:47:31.307691 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q642f\" (UniqueName: \"kubernetes.io/projected/6dfce486-4dc5-4001-9a3d-06149d764ea0-kube-api-access-q642f\") pod \"watcher-operator-controller-manager-6c4d75f7f9-2pmjv\" (UID: \"6dfce486-4dc5-4001-9a3d-06149d764ea0\") " 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" Mar 19 09:47:31.307762 master-0 kubenswrapper[15202]: I0319 09:47:31.307739 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngb8p\" (UniqueName: \"kubernetes.io/projected/5c7c160b-03f5-4120-9169-6c15f43bc781-kube-api-access-ngb8p\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.308569 master-0 kubenswrapper[15202]: I0319 09:47:31.308498 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jvm\" (UniqueName: \"kubernetes.io/projected/4bf9e14b-a962-4ea4-be38-0186a20a5da5-kube-api-access-n5jvm\") pod \"swift-operator-controller-manager-c674c5965-65d6b\" (UID: \"4bf9e14b-a962-4ea4-be38-0186a20a5da5\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" Mar 19 09:47:31.308897 master-0 kubenswrapper[15202]: I0319 09:47:31.308817 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.309336 master-0 kubenswrapper[15202]: I0319 09:47:31.309296 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gd49\" (UniqueName: \"kubernetes.io/projected/ad41e9ab-a275-4b15-9fcf-5ca0da404d52-kube-api-access-2gd49\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bbgx4\" (UID: \"ad41e9ab-a275-4b15-9fcf-5ca0da404d52\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" Mar 19 09:47:31.309697 master-0 kubenswrapper[15202]: I0319 09:47:31.309649 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"] Mar 19 09:47:31.338018 master-0 kubenswrapper[15202]: I0319 09:47:31.336970 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5gwr\" (UniqueName: \"kubernetes.io/projected/9ca4d989-df3a-4b66-a18f-21dffbc966c8-kube-api-access-r5gwr\") pod \"telemetry-operator-controller-manager-d6b694c5-j5ggz\" (UID: \"9ca4d989-df3a-4b66-a18f-21dffbc966c8\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" Mar 19 09:47:31.338018 master-0 kubenswrapper[15202]: I0319 09:47:31.337422 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq74r\" (UniqueName: \"kubernetes.io/projected/3f2734d7-dd75-4742-aa62-2adf7d60700e-kube-api-access-lq74r\") pod \"test-operator-controller-manager-5c5cb9c4d7-5znsj\" (UID: \"3f2734d7-dd75-4742-aa62-2adf7d60700e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" Mar 19 09:47:31.343593 master-0 kubenswrapper[15202]: I0319 09:47:31.343546 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jvm\" (UniqueName: \"kubernetes.io/projected/4bf9e14b-a962-4ea4-be38-0186a20a5da5-kube-api-access-n5jvm\") pod \"swift-operator-controller-manager-c674c5965-65d6b\" (UID: \"4bf9e14b-a962-4ea4-be38-0186a20a5da5\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" Mar 19 09:47:31.352025 master-0 kubenswrapper[15202]: I0319 09:47:31.351316 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" Mar 19 09:47:31.368651 master-0 kubenswrapper[15202]: I0319 09:47:31.368577 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" Mar 19 09:47:31.385675 master-0 kubenswrapper[15202]: I0319 09:47:31.385551 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"] Mar 19 09:47:31.388030 master-0 kubenswrapper[15202]: I0319 09:47:31.387987 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" Mar 19 09:47:31.419450 master-0 kubenswrapper[15202]: I0319 09:47:31.411436 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" Mar 19 09:47:31.430545 master-0 kubenswrapper[15202]: E0319 09:47:31.430427 15202 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:47:31.430731 master-0 kubenswrapper[15202]: E0319 09:47:31.430699 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:31.930609056 +0000 UTC m=+1369.316023882 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "metrics-server-cert" not found Mar 19 09:47:31.432071 master-0 kubenswrapper[15202]: I0319 09:47:31.429941 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.432071 master-0 kubenswrapper[15202]: I0319 09:47:31.431588 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gd49\" (UniqueName: \"kubernetes.io/projected/ad41e9ab-a275-4b15-9fcf-5ca0da404d52-kube-api-access-2gd49\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bbgx4\" (UID: \"ad41e9ab-a275-4b15-9fcf-5ca0da404d52\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" Mar 19 09:47:31.432071 master-0 kubenswrapper[15202]: I0319 09:47:31.431795 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.432071 master-0 kubenswrapper[15202]: I0319 09:47:31.431947 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q642f\" (UniqueName: \"kubernetes.io/projected/6dfce486-4dc5-4001-9a3d-06149d764ea0-kube-api-access-q642f\") pod 
\"watcher-operator-controller-manager-6c4d75f7f9-2pmjv\" (UID: \"6dfce486-4dc5-4001-9a3d-06149d764ea0\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" Mar 19 09:47:31.432071 master-0 kubenswrapper[15202]: I0319 09:47:31.432009 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngb8p\" (UniqueName: \"kubernetes.io/projected/5c7c160b-03f5-4120-9169-6c15f43bc781-kube-api-access-ngb8p\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.433255 master-0 kubenswrapper[15202]: E0319 09:47:31.432761 15202 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:47:31.433255 master-0 kubenswrapper[15202]: E0319 09:47:31.432876 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:31.932847072 +0000 UTC m=+1369.318261878 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "webhook-server-cert" not found Mar 19 09:47:31.470260 master-0 kubenswrapper[15202]: I0319 09:47:31.469776 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q642f\" (UniqueName: \"kubernetes.io/projected/6dfce486-4dc5-4001-9a3d-06149d764ea0-kube-api-access-q642f\") pod \"watcher-operator-controller-manager-6c4d75f7f9-2pmjv\" (UID: \"6dfce486-4dc5-4001-9a3d-06149d764ea0\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" Mar 19 09:47:31.473513 master-0 kubenswrapper[15202]: I0319 09:47:31.473156 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngb8p\" (UniqueName: \"kubernetes.io/projected/5c7c160b-03f5-4120-9169-6c15f43bc781-kube-api-access-ngb8p\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.480694 master-0 kubenswrapper[15202]: I0319 09:47:31.477025 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"] Mar 19 09:47:31.482529 master-0 kubenswrapper[15202]: I0319 09:47:31.482116 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gd49\" (UniqueName: \"kubernetes.io/projected/ad41e9ab-a275-4b15-9fcf-5ca0da404d52-kube-api-access-2gd49\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bbgx4\" (UID: \"ad41e9ab-a275-4b15-9fcf-5ca0da404d52\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" Mar 19 09:47:31.486101 master-0 kubenswrapper[15202]: I0319 09:47:31.486034 15202 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"] Mar 19 09:47:31.529909 master-0 kubenswrapper[15202]: I0319 09:47:31.529837 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"] Mar 19 09:47:31.546519 master-0 kubenswrapper[15202]: I0319 09:47:31.536426 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"] Mar 19 09:47:31.592493 master-0 kubenswrapper[15202]: I0319 09:47:31.590284 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" Mar 19 09:47:31.645951 master-0 kubenswrapper[15202]: I0319 09:47:31.637079 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:31.645951 master-0 kubenswrapper[15202]: E0319 09:47:31.637327 15202 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:47:31.645951 master-0 kubenswrapper[15202]: E0319 09:47:31.637383 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert podName:aeb8435e-9818-471c-b779-9e40d7084842 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:32.637368668 +0000 UTC m=+1370.022783484 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899dzhg7" (UID: "aeb8435e-9818-471c-b779-9e40d7084842") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:47:31.721494 master-0 kubenswrapper[15202]: I0319 09:47:31.721119 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" Mar 19 09:47:31.752943 master-0 kubenswrapper[15202]: I0319 09:47:31.742246 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" Mar 19 09:47:31.752943 master-0 kubenswrapper[15202]: E0319 09:47:31.742587 15202 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:47:31.752943 master-0 kubenswrapper[15202]: E0319 09:47:31.743062 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert podName:4e3edf6d-9015-4d51-96e3-3c6bd898c4fe nodeName:}" failed. No retries permitted until 2026-03-19 09:47:33.743039499 +0000 UTC m=+1371.128454305 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert") pod "infra-operator-controller-manager-7dd6bb94c9-xmlj9" (UID: "4e3edf6d-9015-4d51-96e3-3c6bd898c4fe") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:47:31.762694 master-0 kubenswrapper[15202]: I0319 09:47:31.756368 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n" event={"ID":"e3f266a5-d255-4be9-9205-76f31009a5a5","Type":"ContainerStarted","Data":"0f4a304c6566aa6cd5432b5d1e4e56f3efb34d352e65cccf0484445224a6c9b6"} Mar 19 09:47:31.770397 master-0 kubenswrapper[15202]: I0319 09:47:31.767246 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk" event={"ID":"b2c05fc4-1191-4f97-bd50-fa0decbafbc5","Type":"ContainerStarted","Data":"44efeedec2905e2363e2d3588c50def93952a7bb9fc2c28847484e474937ddf0"} Mar 19 09:47:31.846531 master-0 kubenswrapper[15202]: I0319 09:47:31.785766 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h" event={"ID":"7a63b024-b47d-4e28-b8df-db50a3c95bed","Type":"ContainerStarted","Data":"144cc137d7fd42618b6eeefc4e698e85e6603b5f1a7c67a55064ac753b2ae50e"} Mar 19 09:47:31.846531 master-0 kubenswrapper[15202]: I0319 09:47:31.806415 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m" event={"ID":"1e2ae390-5000-498a-ad34-866298c5db8c","Type":"ContainerStarted","Data":"a43a7dbabea207b978f81a51788c9094f1c05716849ad4abbac31d7f6ed6953f"} Mar 19 09:47:31.956601 master-0 kubenswrapper[15202]: I0319 09:47:31.956101 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.956601 master-0 kubenswrapper[15202]: I0319 09:47:31.956244 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:47:31.986031 master-0 kubenswrapper[15202]: E0319 09:47:31.968420 15202 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:47:31.986031 master-0 kubenswrapper[15202]: E0319 09:47:31.968501 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:32.96848461 +0000 UTC m=+1370.353899426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "metrics-server-cert" not found Mar 19 09:47:31.986031 master-0 kubenswrapper[15202]: E0319 09:47:31.969374 15202 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:47:31.986031 master-0 kubenswrapper[15202]: E0319 09:47:31.969405 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:32.969396863 +0000 UTC m=+1370.354811679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "webhook-server-cert" not found Mar 19 09:47:32.031892 master-0 kubenswrapper[15202]: I0319 09:47:32.031785 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb"] Mar 19 09:47:32.082535 master-0 kubenswrapper[15202]: I0319 09:47:32.080413 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9"] Mar 19 09:47:32.110378 master-0 kubenswrapper[15202]: W0319 09:47:32.110268 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd311bd_f196_4c7c_b694_ad5cd3c64507.slice/crio-06dbcc3e83284b227ed6fe95361afcd3044e385da11daee9f2e7e76ce3f03d87 WatchSource:0}: Error finding container 
06dbcc3e83284b227ed6fe95361afcd3044e385da11daee9f2e7e76ce3f03d87: Status 404 returned error can't find the container with id 06dbcc3e83284b227ed6fe95361afcd3044e385da11daee9f2e7e76ce3f03d87 Mar 19 09:47:32.152222 master-0 kubenswrapper[15202]: I0319 09:47:32.129412 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx"] Mar 19 09:47:32.217643 master-0 kubenswrapper[15202]: I0319 09:47:32.217597 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-r78pl"] Mar 19 09:47:32.277337 master-0 kubenswrapper[15202]: I0319 09:47:32.277049 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr"] Mar 19 09:47:32.302186 master-0 kubenswrapper[15202]: I0319 09:47:32.302083 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk"] Mar 19 09:47:32.524596 master-0 kubenswrapper[15202]: I0319 09:47:32.517111 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz"] Mar 19 09:47:32.546902 master-0 kubenswrapper[15202]: W0319 09:47:32.543975 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca4d989_df3a_4b66_a18f_21dffbc966c8.slice/crio-8619ed31fc6f94226528a6573a808f71f9e8b1773a375e0622f356d1b86a1601 WatchSource:0}: Error finding container 8619ed31fc6f94226528a6573a808f71f9e8b1773a375e0622f356d1b86a1601: Status 404 returned error can't find the container with id 8619ed31fc6f94226528a6573a808f71f9e8b1773a375e0622f356d1b86a1601 Mar 19 09:47:32.673534 master-0 kubenswrapper[15202]: I0319 09:47:32.671941 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:47:32.673534 master-0 kubenswrapper[15202]: E0319 09:47:32.672228 15202 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:47:32.673534 master-0 kubenswrapper[15202]: E0319 09:47:32.672356 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert podName:aeb8435e-9818-471c-b779-9e40d7084842 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:34.672336092 +0000 UTC m=+1372.057750898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899dzhg7" (UID: "aeb8435e-9818-471c-b779-9e40d7084842") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:47:32.684504 master-0 kubenswrapper[15202]: I0319 09:47:32.681373 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h"] Mar 19 09:47:32.698980 master-0 kubenswrapper[15202]: I0319 09:47:32.696584 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b"] Mar 19 09:47:32.721959 master-0 kubenswrapper[15202]: W0319 09:47:32.721868 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85a8f0de_e96a_4be5_8980_36532e0fa45c.slice/crio-5acb80ce2b9ed0659d1a467d34029374bd9e2a181d87fa20e57e1baf57bc13ef WatchSource:0}: Error 
finding container 5acb80ce2b9ed0659d1a467d34029374bd9e2a181d87fa20e57e1baf57bc13ef: Status 404 returned error can't find the container with id 5acb80ce2b9ed0659d1a467d34029374bd9e2a181d87fa20e57e1baf57bc13ef Mar 19 09:47:32.739762 master-0 kubenswrapper[15202]: W0319 09:47:32.737903 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74e399f_b504_430a_a7a7_1aa15487157f.slice/crio-0d48c714c5ee7e7d26d433f1bdf31d161f5e32633a666b11034a576be9fe97f9 WatchSource:0}: Error finding container 0d48c714c5ee7e7d26d433f1bdf31d161f5e32633a666b11034a576be9fe97f9: Status 404 returned error can't find the container with id 0d48c714c5ee7e7d26d433f1bdf31d161f5e32633a666b11034a576be9fe97f9 Mar 19 09:47:32.879815 master-0 kubenswrapper[15202]: I0319 09:47:32.879623 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c" event={"ID":"4251dd77-2fed-4b65-a847-6f1dfa2ac07b","Type":"ContainerStarted","Data":"63000616b05afe8cced95ac996cf87fe642af8ddd5df7c968af5283c7f490f48"} Mar 19 09:47:32.886866 master-0 kubenswrapper[15202]: I0319 09:47:32.886735 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" event={"ID":"cbd311bd-f196-4c7c-b694-ad5cd3c64507","Type":"ContainerStarted","Data":"06dbcc3e83284b227ed6fe95361afcd3044e385da11daee9f2e7e76ce3f03d87"} Mar 19 09:47:32.890228 master-0 kubenswrapper[15202]: I0319 09:47:32.890182 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" event={"ID":"2986b2af-8536-4532-a416-d1092b24cad2","Type":"ContainerStarted","Data":"0db415899009c3138b3e0ae98d792ca5c8f9d52970cd960feae01908a3a72977"} Mar 19 09:47:32.896550 master-0 kubenswrapper[15202]: I0319 09:47:32.896493 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" event={"ID":"9af14adc-2a84-4aa0-88a2-2ccc5de52d8a","Type":"ContainerStarted","Data":"01741231811b457f043999f5c80609923ef4707fba95ecd87b6f9b936daf1bcb"}
Mar 19 09:47:32.905853 master-0 kubenswrapper[15202]: I0319 09:47:32.905785 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" event={"ID":"2fefcbf8-dda4-48d9-b725-4dec37f3bfd9","Type":"ContainerStarted","Data":"4aac61763ec428991011b96f3f10fe3acfdb82c54e9916ed7f5655cf58657afb"}
Mar 19 09:47:32.911254 master-0 kubenswrapper[15202]: I0319 09:47:32.911189 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" event={"ID":"a74e399f-b504-430a-a7a7-1aa15487157f","Type":"ContainerStarted","Data":"0d48c714c5ee7e7d26d433f1bdf31d161f5e32633a666b11034a576be9fe97f9"}
Mar 19 09:47:32.923455 master-0 kubenswrapper[15202]: I0319 09:47:32.923340 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg" event={"ID":"d5b615b7-8989-4ccb-afa3-85f638a5f8f0","Type":"ContainerStarted","Data":"354a80b2ce5fdeecdd8fe64d7408ef894f135d00eb498db50752ed0459e5bcc8"}
Mar 19 09:47:32.954396 master-0 kubenswrapper[15202]: I0319 09:47:32.954288 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" event={"ID":"da0ad24d-f0dc-43fb-939e-0fa4c84473f4","Type":"ContainerStarted","Data":"f41fd37c1ca4c6eef5fc483c527545d243563e286d53a3b6d30d7d539e88175f"}
Mar 19 09:47:32.957326 master-0 kubenswrapper[15202]: I0319 09:47:32.957292 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" event={"ID":"9ca4d989-df3a-4b66-a18f-21dffbc966c8","Type":"ContainerStarted","Data":"8619ed31fc6f94226528a6573a808f71f9e8b1773a375e0622f356d1b86a1601"}
Mar 19 09:47:32.970024 master-0 kubenswrapper[15202]: I0319 09:47:32.962901 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" event={"ID":"85a8f0de-e96a-4be5-8980-36532e0fa45c","Type":"ContainerStarted","Data":"5acb80ce2b9ed0659d1a467d34029374bd9e2a181d87fa20e57e1baf57bc13ef"}
Mar 19 09:47:32.984213 master-0 kubenswrapper[15202]: I0319 09:47:32.984168 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" event={"ID":"f48bfc98-b4af-4ebe-b96f-ac119a0887d4","Type":"ContainerStarted","Data":"b7fe0624f8a3280a8960034a8cabe1e3e88168d2ef68b47ded0b5e01e02de355"}
Mar 19 09:47:32.990037 master-0 kubenswrapper[15202]: I0319 09:47:32.989949 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:32.990201 master-0 kubenswrapper[15202]: I0319 09:47:32.990141 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:32.990309 master-0 kubenswrapper[15202]: E0319 09:47:32.990285 15202 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 19 09:47:32.990353 master-0 kubenswrapper[15202]: E0319 09:47:32.990334 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:34.990319392 +0000 UTC m=+1372.375734208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "webhook-server-cert" not found
Mar 19 09:47:32.990919 master-0 kubenswrapper[15202]: E0319 09:47:32.990788 15202 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 19 09:47:32.990919 master-0 kubenswrapper[15202]: E0319 09:47:32.990858 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:34.990842824 +0000 UTC m=+1372.376257640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "metrics-server-cert" not found
Mar 19 09:47:33.068152 master-0 kubenswrapper[15202]: W0319 09:47:33.067737 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f2734d7_dd75_4742_aa62_2adf7d60700e.slice/crio-7f150aceb8c5a28313e81a0031be0ea45c85af6911f2f8cf8c22eee5b92c75b3 WatchSource:0}: Error finding container 7f150aceb8c5a28313e81a0031be0ea45c85af6911f2f8cf8c22eee5b92c75b3: Status 404 returned error can't find the container with id 7f150aceb8c5a28313e81a0031be0ea45c85af6911f2f8cf8c22eee5b92c75b3
Mar 19 09:47:33.109867 master-0 kubenswrapper[15202]: I0319 09:47:33.109583 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-65d6b"]
Mar 19 09:47:33.133566 master-0 kubenswrapper[15202]: I0319 09:47:33.132316 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx"]
Mar 19 09:47:33.166857 master-0 kubenswrapper[15202]: I0319 09:47:33.166803 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj"]
Mar 19 09:47:33.193101 master-0 kubenswrapper[15202]: W0319 09:47:33.193020 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dfce486_4dc5_4001_9a3d_06149d764ea0.slice/crio-020c5931fc395487b0ef0e162524c9e30e75a501fb489b4436cf710c8fe94310 WatchSource:0}: Error finding container 020c5931fc395487b0ef0e162524c9e30e75a501fb489b4436cf710c8fe94310: Status 404 returned error can't find the container with id 020c5931fc395487b0ef0e162524c9e30e75a501fb489b4436cf710c8fe94310
Mar 19 09:47:33.215611 master-0 kubenswrapper[15202]: I0319 09:47:33.215481 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4"]
Mar 19 09:47:33.232099 master-0 kubenswrapper[15202]: I0319 09:47:33.232020 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv"]
Mar 19 09:47:33.818300 master-0 kubenswrapper[15202]: I0319 09:47:33.818248 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:33.821543 master-0 kubenswrapper[15202]: E0319 09:47:33.820445 15202 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 19 09:47:33.821543 master-0 kubenswrapper[15202]: E0319 09:47:33.820560 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert podName:4e3edf6d-9015-4d51-96e3-3c6bd898c4fe nodeName:}" failed. No retries permitted until 2026-03-19 09:47:37.820539445 +0000 UTC m=+1375.205954261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert") pod "infra-operator-controller-manager-7dd6bb94c9-xmlj9" (UID: "4e3edf6d-9015-4d51-96e3-3c6bd898c4fe") : secret "infra-operator-webhook-server-cert" not found
Mar 19 09:47:34.009057 master-0 kubenswrapper[15202]: I0319 09:47:34.008975 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" event={"ID":"3b73f526-cd94-4119-bcc1-8aa00e58b6ce","Type":"ContainerStarted","Data":"91c12adaddacd3a32bf04dfc00475ac418432548388c1b5562b89ea5755971f8"}
Mar 19 09:47:34.010809 master-0 kubenswrapper[15202]: I0319 09:47:34.010779 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" event={"ID":"4bf9e14b-a962-4ea4-be38-0186a20a5da5","Type":"ContainerStarted","Data":"35cb566b80ec60f1d9fe221b44898570c92c364069220574e23fd3b1aab07857"}
Mar 19 09:47:34.014161 master-0 kubenswrapper[15202]: I0319 09:47:34.014130 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" event={"ID":"ad41e9ab-a275-4b15-9fcf-5ca0da404d52","Type":"ContainerStarted","Data":"a883090a98136e039efffab1f1e05640688eccd9f9405cf5cd516c8a02d71710"}
Mar 19 09:47:34.019082 master-0 kubenswrapper[15202]: I0319 09:47:34.019037 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" event={"ID":"3f2734d7-dd75-4742-aa62-2adf7d60700e","Type":"ContainerStarted","Data":"7f150aceb8c5a28313e81a0031be0ea45c85af6911f2f8cf8c22eee5b92c75b3"}
Mar 19 09:47:34.023756 master-0 kubenswrapper[15202]: I0319 09:47:34.023719 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" event={"ID":"6dfce486-4dc5-4001-9a3d-06149d764ea0","Type":"ContainerStarted","Data":"020c5931fc395487b0ef0e162524c9e30e75a501fb489b4436cf710c8fe94310"}
Mar 19 09:47:34.764709 master-0 kubenswrapper[15202]: I0319 09:47:34.764639 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7"
Mar 19 09:47:34.765007 master-0 kubenswrapper[15202]: E0319 09:47:34.764753 15202 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 19 09:47:34.765007 master-0 kubenswrapper[15202]: E0319 09:47:34.764891 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert podName:aeb8435e-9818-471c-b779-9e40d7084842 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:38.764862107 +0000 UTC m=+1376.150276923 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899dzhg7" (UID: "aeb8435e-9818-471c-b779-9e40d7084842") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 19 09:47:35.078331 master-0 kubenswrapper[15202]: E0319 09:47:35.078189 15202 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 19 09:47:35.078331 master-0 kubenswrapper[15202]: E0319 09:47:35.078302 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:39.078282014 +0000 UTC m=+1376.463696830 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "metrics-server-cert" not found
Mar 19 09:47:35.079038 master-0 kubenswrapper[15202]: I0319 09:47:35.078077 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:35.079242 master-0 kubenswrapper[15202]: I0319 09:47:35.079189 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:35.079387 master-0 kubenswrapper[15202]: E0319 09:47:35.079358 15202 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 19 09:47:35.079459 master-0 kubenswrapper[15202]: E0319 09:47:35.079404 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:39.079395322 +0000 UTC m=+1376.464810138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "webhook-server-cert" not found
Mar 19 09:47:37.844187 master-0 kubenswrapper[15202]: I0319 09:47:37.844118 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:37.844751 master-0 kubenswrapper[15202]: E0319 09:47:37.844349 15202 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 19 09:47:37.844751 master-0 kubenswrapper[15202]: E0319 09:47:37.844412 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert podName:4e3edf6d-9015-4d51-96e3-3c6bd898c4fe nodeName:}" failed. No retries permitted until 2026-03-19 09:47:45.844393516 +0000 UTC m=+1383.229808332 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert") pod "infra-operator-controller-manager-7dd6bb94c9-xmlj9" (UID: "4e3edf6d-9015-4d51-96e3-3c6bd898c4fe") : secret "infra-operator-webhook-server-cert" not found
Mar 19 09:47:38.773553 master-0 kubenswrapper[15202]: I0319 09:47:38.773462 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7"
Mar 19 09:47:38.773794 master-0 kubenswrapper[15202]: E0319 09:47:38.773610 15202 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 19 09:47:38.773794 master-0 kubenswrapper[15202]: E0319 09:47:38.773673 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert podName:aeb8435e-9818-471c-b779-9e40d7084842 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:46.773657488 +0000 UTC m=+1384.159072304 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899dzhg7" (UID: "aeb8435e-9818-471c-b779-9e40d7084842") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 19 09:47:39.080530 master-0 kubenswrapper[15202]: I0319 09:47:39.080318 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:39.081311 master-0 kubenswrapper[15202]: I0319 09:47:39.080563 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:39.081311 master-0 kubenswrapper[15202]: E0319 09:47:39.080803 15202 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 19 09:47:39.081566 master-0 kubenswrapper[15202]: E0319 09:47:39.081523 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:47.081502409 +0000 UTC m=+1384.466917235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "metrics-server-cert" not found
Mar 19 09:47:39.081726 master-0 kubenswrapper[15202]: E0319 09:47:39.081703 15202 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 19 09:47:39.081802 master-0 kubenswrapper[15202]: E0319 09:47:39.081783 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:47.081733125 +0000 UTC m=+1384.467147941 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "webhook-server-cert" not found
Mar 19 09:47:45.929946 master-0 kubenswrapper[15202]: I0319 09:47:45.929808 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:45.933640 master-0 kubenswrapper[15202]: I0319 09:47:45.933582 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3edf6d-9015-4d51-96e3-3c6bd898c4fe-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-xmlj9\" (UID: \"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:46.062111 master-0 kubenswrapper[15202]: I0319 09:47:46.062012 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"
Mar 19 09:47:46.846758 master-0 kubenswrapper[15202]: I0319 09:47:46.846636 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7"
Mar 19 09:47:46.846981 master-0 kubenswrapper[15202]: E0319 09:47:46.846809 15202 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 19 09:47:46.846981 master-0 kubenswrapper[15202]: E0319 09:47:46.846866 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert podName:aeb8435e-9818-471c-b779-9e40d7084842 nodeName:}" failed. No retries permitted until 2026-03-19 09:48:02.846851009 +0000 UTC m=+1400.232265825 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899dzhg7" (UID: "aeb8435e-9818-471c-b779-9e40d7084842") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 19 09:47:47.152050 master-0 kubenswrapper[15202]: I0319 09:47:47.151886 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:47.152050 master-0 kubenswrapper[15202]: I0319 09:47:47.152043 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:47.152996 master-0 kubenswrapper[15202]: E0319 09:47:47.152241 15202 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 19 09:47:47.152996 master-0 kubenswrapper[15202]: E0319 09:47:47.152322 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs podName:5c7c160b-03f5-4120-9169-6c15f43bc781 nodeName:}" failed. No retries permitted until 2026-03-19 09:48:03.152300601 +0000 UTC m=+1400.537715427 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-8hx4g" (UID: "5c7c160b-03f5-4120-9169-6c15f43bc781") : secret "webhook-server-cert" not found
Mar 19 09:47:47.155852 master-0 kubenswrapper[15202]: I0319 09:47:47.155770 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"
Mar 19 09:47:54.277612 master-0 kubenswrapper[15202]: I0319 09:47:54.277538 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9"]
Mar 19 09:47:54.350728 master-0 kubenswrapper[15202]: I0319 09:47:54.350617 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" event={"ID":"a74e399f-b504-430a-a7a7-1aa15487157f","Type":"ContainerStarted","Data":"9c608e8e44aa720382abc33242af58cab4218fe30d0c5468b0c39f5656d66761"}
Mar 19 09:47:54.351987 master-0 kubenswrapper[15202]: I0319 09:47:54.351874 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h"
Mar 19 09:47:54.367698 master-0 kubenswrapper[15202]: I0319 09:47:54.367319 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h" event={"ID":"7a63b024-b47d-4e28-b8df-db50a3c95bed","Type":"ContainerStarted","Data":"a84499d3eb2f904935def9771c93fef63cd3038c4bd4530c91526ca720134092"}
Mar 19 09:47:54.368450 master-0 kubenswrapper[15202]: I0319 09:47:54.368431 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h"
Mar 19 09:47:54.386658 master-0 kubenswrapper[15202]: I0319 09:47:54.386583 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg" event={"ID":"d5b615b7-8989-4ccb-afa3-85f638a5f8f0","Type":"ContainerStarted","Data":"1783814a36fd09b078a06bce2c775394d88e767af8dce7a05bd3ab78dfc14b7f"}
Mar 19 09:47:54.386658 master-0 kubenswrapper[15202]: I0319 09:47:54.386627 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg"
Mar 19 09:47:54.391994 master-0 kubenswrapper[15202]: I0319 09:47:54.391696 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" event={"ID":"2986b2af-8536-4532-a416-d1092b24cad2","Type":"ContainerStarted","Data":"c8d9cf04e1e2df4a06a379376d68b999487c0a840f66b43fd1717fd1b714345b"}
Mar 19 09:47:54.392391 master-0 kubenswrapper[15202]: I0319 09:47:54.392102 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl"
Mar 19 09:47:54.401506 master-0 kubenswrapper[15202]: I0319 09:47:54.395853 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" event={"ID":"9af14adc-2a84-4aa0-88a2-2ccc5de52d8a","Type":"ContainerStarted","Data":"1a52dd8246bcb4b2e76e5d82650d428d8299fc1123e1c8c78b9941b33fc08ef1"}
Mar 19 09:47:54.401506 master-0 kubenswrapper[15202]: I0319 09:47:54.396804 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr"
Mar 19 09:47:54.401506 master-0 kubenswrapper[15202]: I0319 09:47:54.398540 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" event={"ID":"4bf9e14b-a962-4ea4-be38-0186a20a5da5","Type":"ContainerStarted","Data":"f9848b1cbf44bc7d8297abaf5605cab9ee539e0d555f004b57849c33890c07bf"}
Mar 19 09:47:54.401506 master-0 kubenswrapper[15202]: I0319 09:47:54.399106 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b"
Mar 19 09:47:54.402717 master-0 kubenswrapper[15202]: I0319 09:47:54.402308 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" event={"ID":"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe","Type":"ContainerStarted","Data":"45c7a68763273b80e1c76164be0509053edd95ebde15650f5642b7522631541e"}
Mar 19 09:47:54.423487 master-0 kubenswrapper[15202]: I0319 09:47:54.415617 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" podStartSLOduration=4.413633565 podStartE2EDuration="25.415594337s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.755810607 +0000 UTC m=+1370.141225423" lastFinishedPulling="2026-03-19 09:47:53.757771379 +0000 UTC m=+1391.143186195" observedRunningTime="2026-03-19 09:47:54.370093227 +0000 UTC m=+1391.755508043" watchObservedRunningTime="2026-03-19 09:47:54.415594337 +0000 UTC m=+1391.801009153"
Mar 19 09:47:54.455500 master-0 kubenswrapper[15202]: I0319 09:47:54.455381 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h" podStartSLOduration=2.689571702 podStartE2EDuration="25.455360587s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:31.009737372 +0000 UTC m=+1368.395152188" lastFinishedPulling="2026-03-19 09:47:53.775526257 +0000 UTC m=+1391.160941073" observedRunningTime="2026-03-19 09:47:54.399546842 +0000 UTC m=+1391.784961658" watchObservedRunningTime="2026-03-19 09:47:54.455360587 +0000 UTC m=+1391.840775403"
Mar 19 09:47:54.496236 master-0 kubenswrapper[15202]: I0319 09:47:54.494919 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" podStartSLOduration=3.978207464 podStartE2EDuration="25.49490017s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.242107949 +0000 UTC m=+1369.627522765" lastFinishedPulling="2026-03-19 09:47:53.758800635 +0000 UTC m=+1391.144215471" observedRunningTime="2026-03-19 09:47:54.432869752 +0000 UTC m=+1391.818284558" watchObservedRunningTime="2026-03-19 09:47:54.49490017 +0000 UTC m=+1391.880314986"
Mar 19 09:47:54.528026 master-0 kubenswrapper[15202]: I0319 09:47:54.524786 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" podStartSLOduration=4.040152568 podStartE2EDuration="25.524766755s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.274263499 +0000 UTC m=+1369.659678315" lastFinishedPulling="2026-03-19 09:47:53.758877686 +0000 UTC m=+1391.144292502" observedRunningTime="2026-03-19 09:47:54.46521646 +0000 UTC m=+1391.850631276" watchObservedRunningTime="2026-03-19 09:47:54.524766755 +0000 UTC m=+1391.910181561"
Mar 19 09:47:54.570995 master-0 kubenswrapper[15202]: I0319 09:47:54.548920 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg" podStartSLOduration=3.52571801 podStartE2EDuration="25.548902749s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:31.756188683 +0000 UTC m=+1369.141603499" lastFinishedPulling="2026-03-19 09:47:53.779373412 +0000 UTC m=+1391.164788238" observedRunningTime="2026-03-19 09:47:54.504563608 +0000 UTC m=+1391.889978434" watchObservedRunningTime="2026-03-19 09:47:54.548902749 +0000 UTC m=+1391.934317565"
Mar 19 09:47:54.588316 master-0 kubenswrapper[15202]: I0319 09:47:54.585838 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" podStartSLOduration=3.820076393 podStartE2EDuration="24.585813439s" podCreationTimestamp="2026-03-19 09:47:30 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.992937226 +0000 UTC m=+1370.378352042" lastFinishedPulling="2026-03-19 09:47:53.758674272 +0000 UTC m=+1391.144089088" observedRunningTime="2026-03-19 09:47:54.534149967 +0000 UTC m=+1391.919564793" watchObservedRunningTime="2026-03-19 09:47:54.585813439 +0000 UTC m=+1391.971228265"
Mar 19 09:47:55.428567 master-0 kubenswrapper[15202]: I0319 09:47:55.427599 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" event={"ID":"9ca4d989-df3a-4b66-a18f-21dffbc966c8","Type":"ContainerStarted","Data":"ad297a1ba94df4f1c32360ba6bd0674ec25083aca61bb10778f3e032da9d1c33"}
Mar 19 09:47:55.429131 master-0 kubenswrapper[15202]: I0319 09:47:55.428676 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz"
Mar 19 09:47:55.457135 master-0 kubenswrapper[15202]: I0319 09:47:55.457070 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m" event={"ID":"1e2ae390-5000-498a-ad34-866298c5db8c","Type":"ContainerStarted","Data":"2eab0a9ff8b3ea2520f0fd84ca9d7800b7c79ff81bf6a680bc836f33d0a13d43"}
Mar 19 09:47:55.457708 master-0 kubenswrapper[15202]: I0319 09:47:55.457681 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m"
Mar 19 09:47:55.470726 master-0 kubenswrapper[15202]: I0319 09:47:55.470654 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" event={"ID":"ad41e9ab-a275-4b15-9fcf-5ca0da404d52","Type":"ContainerStarted","Data":"4c2ac460c894eec6786b80b83fe58c9408f1ac7a2d21d0d5acc22df33534fd8b"}
Mar 19 09:47:55.476004 master-0 kubenswrapper[15202]: I0319 09:47:55.475932 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" event={"ID":"3f2734d7-dd75-4742-aa62-2adf7d60700e","Type":"ContainerStarted","Data":"5ccd68d69f4857af0cb12790faccc38ee025889574f953abfaa4abbdd0c5e423"}
Mar 19 09:47:55.476235 master-0 kubenswrapper[15202]: I0319 09:47:55.476206 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj"
Mar 19 09:47:55.485094 master-0 kubenswrapper[15202]: I0319 09:47:55.484988 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" podStartSLOduration=4.25327742 podStartE2EDuration="25.484968849s" podCreationTimestamp="2026-03-19 09:47:30 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.551162758 +0000 UTC m=+1369.936577574" lastFinishedPulling="2026-03-19 09:47:53.782854187 +0000 UTC m=+1391.168269003" observedRunningTime="2026-03-19 09:47:55.468531124 +0000 UTC m=+1392.853945940" watchObservedRunningTime="2026-03-19 09:47:55.484968849 +0000 UTC m=+1392.870383665"
Mar 19 09:47:55.490871 master-0 kubenswrapper[15202]: I0319 09:47:55.490814 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n" event={"ID":"e3f266a5-d255-4be9-9205-76f31009a5a5","Type":"ContainerStarted","Data":"b083728c3cb05c33e32e625c1f6125b4e2a3d3861e88eb3a68a8a052f661baeb"}
Mar 19 09:47:55.491702 master-0 kubenswrapper[15202]: I0319 09:47:55.491686 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n"
Mar 19 09:47:55.500052 master-0 kubenswrapper[15202]: I0319 09:47:55.499992 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" event={"ID":"3b73f526-cd94-4119-bcc1-8aa00e58b6ce","Type":"ContainerStarted","Data":"22a30f6d99058b04f4d328f92b716458646a56d043ca90961d16f0e17517dc6d"}
Mar 19 09:47:55.501214 master-0 kubenswrapper[15202]: I0319 09:47:55.501193 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx"
Mar 19 09:47:55.503248 master-0 kubenswrapper[15202]: I0319 09:47:55.503200 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" event={"ID":"cbd311bd-f196-4c7c-b694-ad5cd3c64507","Type":"ContainerStarted","Data":"7ffc4f4b89abc290a80a5cc1c8eda1a33157ca87a77427cd0f1e3814fe33cbe2"}
Mar 19 09:47:55.507534 master-0 kubenswrapper[15202]: I0319 09:47:55.504536 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx"
Mar 19 09:47:55.516493 master-0 kubenswrapper[15202]: I0319 09:47:55.513058 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" event={"ID":"6dfce486-4dc5-4001-9a3d-06149d764ea0","Type":"ContainerStarted","Data":"7b3f76ecf32dfad4c8c1c0a0e4e55024665e9d44d5b242206f6dd0a84f9e38d3"}
Mar 19 09:47:55.516493 master-0 kubenswrapper[15202]: I0319 09:47:55.513895 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv"
Mar 19 09:47:55.529138 master-0 kubenswrapper[15202]: I0319 09:47:55.529000 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c" event={"ID":"4251dd77-2fed-4b65-a847-6f1dfa2ac07b","Type":"ContainerStarted","Data":"69660df4e08dde98297a7b5dada7a192df3f48130570aa57111d80a370e29c99"}
Mar 19 09:47:55.529564 master-0 kubenswrapper[15202]: I0319 09:47:55.529538 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c"
Mar 19 09:47:55.537556 master-0 kubenswrapper[15202]: I0319 09:47:55.534948 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk" event={"ID":"b2c05fc4-1191-4f97-bd50-fa0decbafbc5","Type":"ContainerStarted","Data":"a7ade41a8334ba13ddc4ac8c84950824dd29162792d88a3eeeccfa93bbb01e78"}
Mar 19 09:47:55.537556 master-0 kubenswrapper[15202]: I0319 09:47:55.535933 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk"
Mar 19 09:47:55.537556 master-0 kubenswrapper[15202]: I0319 09:47:55.537218 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" event={"ID":"85a8f0de-e96a-4be5-8980-36532e0fa45c","Type":"ContainerStarted","Data":"fd6b6429947b2c563166c5c88d67db7978fb6461e038ae64af1d25c6333698b7"}
Mar 19 09:47:55.537816 master-0 kubenswrapper[15202]: I0319 09:47:55.537629 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b"
Mar 19 09:47:55.545185 master-0 kubenswrapper[15202]: I0319 09:47:55.542744 15202 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" event={"ID":"f48bfc98-b4af-4ebe-b96f-ac119a0887d4","Type":"ContainerStarted","Data":"dacda4ce082d50c1b7961c37f19051ce1cb8b0474f7d67d1c98acb0a1fd588e7"} Mar 19 09:47:55.545185 master-0 kubenswrapper[15202]: I0319 09:47:55.543397 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:47:55.545185 master-0 kubenswrapper[15202]: I0319 09:47:55.544461 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" event={"ID":"2fefcbf8-dda4-48d9-b725-4dec37f3bfd9","Type":"ContainerStarted","Data":"a71c17651ad3be31652035ade1281239f8cd9eaadd054d0af80077d0961a5d14"} Mar 19 09:47:55.545185 master-0 kubenswrapper[15202]: I0319 09:47:55.544906 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:47:55.567030 master-0 kubenswrapper[15202]: I0319 09:47:55.566977 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" event={"ID":"da0ad24d-f0dc-43fb-939e-0fa4c84473f4","Type":"ContainerStarted","Data":"7901371878ccdf9a5ea7f55009fdd443ee59b0e7b4f15f8dba9fb3487ef2479d"} Mar 19 09:47:55.567300 master-0 kubenswrapper[15202]: I0319 09:47:55.567044 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" Mar 19 09:47:55.605557 master-0 kubenswrapper[15202]: I0319 09:47:55.605456 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m" podStartSLOduration=3.653626112 podStartE2EDuration="26.605437655s" podCreationTimestamp="2026-03-19 09:47:29 
+0000 UTC" firstStartedPulling="2026-03-19 09:47:30.805842963 +0000 UTC m=+1368.191257779" lastFinishedPulling="2026-03-19 09:47:53.757654506 +0000 UTC m=+1391.143069322" observedRunningTime="2026-03-19 09:47:55.601941979 +0000 UTC m=+1392.987356805" watchObservedRunningTime="2026-03-19 09:47:55.605437655 +0000 UTC m=+1392.990852481" Mar 19 09:47:56.157033 master-0 kubenswrapper[15202]: I0319 09:47:56.154379 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" podStartSLOduration=5.504422664 podStartE2EDuration="27.154347851s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.109343199 +0000 UTC m=+1369.494758015" lastFinishedPulling="2026-03-19 09:47:53.759268386 +0000 UTC m=+1391.144683202" observedRunningTime="2026-03-19 09:47:56.150463325 +0000 UTC m=+1393.535878151" watchObservedRunningTime="2026-03-19 09:47:56.154347851 +0000 UTC m=+1393.539762667" Mar 19 09:47:57.156226 master-0 kubenswrapper[15202]: I0319 09:47:57.156028 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbgx4" podStartSLOduration=6.452467642 podStartE2EDuration="27.155998245s" podCreationTimestamp="2026-03-19 09:47:30 +0000 UTC" firstStartedPulling="2026-03-19 09:47:33.170614131 +0000 UTC m=+1370.556028947" lastFinishedPulling="2026-03-19 09:47:53.874144734 +0000 UTC m=+1391.259559550" observedRunningTime="2026-03-19 09:47:57.144290507 +0000 UTC m=+1394.529705333" watchObservedRunningTime="2026-03-19 09:47:57.155998245 +0000 UTC m=+1394.541413061" Mar 19 09:47:57.625497 master-0 kubenswrapper[15202]: I0319 09:47:57.624767 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" podStartSLOduration=7.589239579 podStartE2EDuration="28.624744127s" 
podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.743764521 +0000 UTC m=+1370.129179327" lastFinishedPulling="2026-03-19 09:47:53.779269059 +0000 UTC m=+1391.164683875" observedRunningTime="2026-03-19 09:47:57.60251857 +0000 UTC m=+1394.987933406" watchObservedRunningTime="2026-03-19 09:47:57.624744127 +0000 UTC m=+1395.010158973" Mar 19 09:47:57.694175 master-0 kubenswrapper[15202]: I0319 09:47:57.692740 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" podStartSLOduration=7.041969077 podStartE2EDuration="27.692721151s" podCreationTimestamp="2026-03-19 09:47:30 +0000 UTC" firstStartedPulling="2026-03-19 09:47:33.109161798 +0000 UTC m=+1370.494576614" lastFinishedPulling="2026-03-19 09:47:53.759913872 +0000 UTC m=+1391.145328688" observedRunningTime="2026-03-19 09:47:57.642917715 +0000 UTC m=+1395.028332531" watchObservedRunningTime="2026-03-19 09:47:57.692721151 +0000 UTC m=+1395.078135967" Mar 19 09:47:57.698105 master-0 kubenswrapper[15202]: I0319 09:47:57.698037 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" podStartSLOduration=7.976372201 podStartE2EDuration="28.698018751s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:33.037815661 +0000 UTC m=+1370.423230477" lastFinishedPulling="2026-03-19 09:47:53.759462211 +0000 UTC m=+1391.144877027" observedRunningTime="2026-03-19 09:47:57.693275865 +0000 UTC m=+1395.078690681" watchObservedRunningTime="2026-03-19 09:47:57.698018751 +0000 UTC m=+1395.083433567" Mar 19 09:47:57.728675 master-0 kubenswrapper[15202]: I0319 09:47:57.727761 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk" podStartSLOduration=6.242410227 
podStartE2EDuration="28.727738414s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:31.273377395 +0000 UTC m=+1368.658792211" lastFinishedPulling="2026-03-19 09:47:53.758705582 +0000 UTC m=+1391.144120398" observedRunningTime="2026-03-19 09:47:57.716582569 +0000 UTC m=+1395.101997385" watchObservedRunningTime="2026-03-19 09:47:57.727738414 +0000 UTC m=+1395.113153230" Mar 19 09:47:57.746577 master-0 kubenswrapper[15202]: I0319 09:47:57.746246 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" podStartSLOduration=7.269848485 podStartE2EDuration="28.746222809s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.282362609 +0000 UTC m=+1369.667777425" lastFinishedPulling="2026-03-19 09:47:53.758736933 +0000 UTC m=+1391.144151749" observedRunningTime="2026-03-19 09:47:57.74262098 +0000 UTC m=+1395.128035796" watchObservedRunningTime="2026-03-19 09:47:57.746222809 +0000 UTC m=+1395.131637625" Mar 19 09:47:57.781125 master-0 kubenswrapper[15202]: I0319 09:47:57.781023 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" podStartSLOduration=7.195903318 podStartE2EDuration="27.780997515s" podCreationTimestamp="2026-03-19 09:47:30 +0000 UTC" firstStartedPulling="2026-03-19 09:47:33.198518059 +0000 UTC m=+1370.583932875" lastFinishedPulling="2026-03-19 09:47:53.783612256 +0000 UTC m=+1391.169027072" observedRunningTime="2026-03-19 09:47:57.77507848 +0000 UTC m=+1395.160493296" watchObservedRunningTime="2026-03-19 09:47:57.780997515 +0000 UTC m=+1395.166412331" Mar 19 09:47:57.819696 master-0 kubenswrapper[15202]: I0319 09:47:57.819590 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n" 
podStartSLOduration=6.262206552 podStartE2EDuration="28.819572474s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:31.275316052 +0000 UTC m=+1368.660730868" lastFinishedPulling="2026-03-19 09:47:53.832681974 +0000 UTC m=+1391.218096790" observedRunningTime="2026-03-19 09:47:57.812863699 +0000 UTC m=+1395.198278525" watchObservedRunningTime="2026-03-19 09:47:57.819572474 +0000 UTC m=+1395.204987290" Mar 19 09:47:57.840794 master-0 kubenswrapper[15202]: I0319 09:47:57.839441 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" podStartSLOduration=7.194437838 podStartE2EDuration="28.839421003s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.114308201 +0000 UTC m=+1369.499723027" lastFinishedPulling="2026-03-19 09:47:53.759291376 +0000 UTC m=+1391.144706192" observedRunningTime="2026-03-19 09:47:57.837594138 +0000 UTC m=+1395.223008954" watchObservedRunningTime="2026-03-19 09:47:57.839421003 +0000 UTC m=+1395.224835819" Mar 19 09:47:57.879622 master-0 kubenswrapper[15202]: I0319 09:47:57.878448 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" podStartSLOduration=7.145457912 podStartE2EDuration="28.878428554s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:32.046484991 +0000 UTC m=+1369.431899807" lastFinishedPulling="2026-03-19 09:47:53.779455633 +0000 UTC m=+1391.164870449" observedRunningTime="2026-03-19 09:47:57.874491267 +0000 UTC m=+1395.259906083" watchObservedRunningTime="2026-03-19 09:47:57.878428554 +0000 UTC m=+1395.263843370" Mar 19 09:47:57.909999 master-0 kubenswrapper[15202]: I0319 09:47:57.909831 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c" podStartSLOduration=6.9378613 podStartE2EDuration="28.909811607s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:31.8064238 +0000 UTC m=+1369.191838616" lastFinishedPulling="2026-03-19 09:47:53.778374107 +0000 UTC m=+1391.163788923" observedRunningTime="2026-03-19 09:47:57.90387337 +0000 UTC m=+1395.289288196" watchObservedRunningTime="2026-03-19 09:47:57.909811607 +0000 UTC m=+1395.295226413" Mar 19 09:47:59.624562 master-0 kubenswrapper[15202]: I0319 09:47:59.624160 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" event={"ID":"4e3edf6d-9015-4d51-96e3-3c6bd898c4fe","Type":"ContainerStarted","Data":"11bad97ef7fa60498fc5e3ed9ade7eed52f8774d5438f56e33cbd3bad1fbf1d1"} Mar 19 09:47:59.625973 master-0 kubenswrapper[15202]: I0319 09:47:59.625907 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" Mar 19 09:47:59.654771 master-0 kubenswrapper[15202]: I0319 09:47:59.654631 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" podStartSLOduration=26.467813336 podStartE2EDuration="30.654597609s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:47:54.295617583 +0000 UTC m=+1391.681032399" lastFinishedPulling="2026-03-19 09:47:58.482401856 +0000 UTC m=+1395.867816672" observedRunningTime="2026-03-19 09:47:59.643690041 +0000 UTC m=+1397.029104847" watchObservedRunningTime="2026-03-19 09:47:59.654597609 +0000 UTC m=+1397.040012425" Mar 19 09:47:59.911190 master-0 kubenswrapper[15202]: I0319 09:47:59.911059 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-j929h" Mar 19 09:47:59.940028 master-0 kubenswrapper[15202]: I0319 09:47:59.939974 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zvf6m" Mar 19 09:48:00.054984 master-0 kubenswrapper[15202]: I0319 09:48:00.054916 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lmp5n" Mar 19 09:48:00.166457 master-0 kubenswrapper[15202]: I0319 09:48:00.166344 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq7cg" Mar 19 09:48:00.211488 master-0 kubenswrapper[15202]: I0319 09:48:00.211403 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-ft2mk" Mar 19 09:48:00.294936 master-0 kubenswrapper[15202]: I0319 09:48:00.294893 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-b8s4c" Mar 19 09:48:00.511513 master-0 kubenswrapper[15202]: I0319 09:48:00.511408 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qlfpx" Mar 19 09:48:00.532991 master-0 kubenswrapper[15202]: I0319 09:48:00.532909 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-kh9rb" Mar 19 09:48:00.634407 master-0 kubenswrapper[15202]: I0319 09:48:00.634334 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6n7n9" Mar 19 09:48:01.011749 master-0 kubenswrapper[15202]: I0319 09:48:01.011700 15202 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-r78pl" Mar 19 09:48:01.045763 master-0 kubenswrapper[15202]: I0319 09:48:01.045714 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-s5trr" Mar 19 09:48:01.099030 master-0 kubenswrapper[15202]: I0319 09:48:01.094906 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-pw2xk" Mar 19 09:48:01.268812 master-0 kubenswrapper[15202]: I0319 09:48:01.268621 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-7fq2b" Mar 19 09:48:01.283063 master-0 kubenswrapper[15202]: I0319 09:48:01.282989 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-jv72h" Mar 19 09:48:01.356638 master-0 kubenswrapper[15202]: I0319 09:48:01.354714 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4tjlx" Mar 19 09:48:01.372160 master-0 kubenswrapper[15202]: I0319 09:48:01.372095 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-j5ggz" Mar 19 09:48:01.404098 master-0 kubenswrapper[15202]: I0319 09:48:01.404038 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-65d6b" Mar 19 09:48:01.414868 master-0 kubenswrapper[15202]: I0319 09:48:01.414827 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-5znsj" Mar 19 09:48:01.726529 
master-0 kubenswrapper[15202]: I0319 09:48:01.726430 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2pmjv" Mar 19 09:48:02.881604 master-0 kubenswrapper[15202]: I0319 09:48:02.881515 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:48:02.885786 master-0 kubenswrapper[15202]: I0319 09:48:02.885732 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aeb8435e-9818-471c-b779-9e40d7084842-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899dzhg7\" (UID: \"aeb8435e-9818-471c-b779-9e40d7084842\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:48:03.110823 master-0 kubenswrapper[15202]: I0319 09:48:03.110648 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:48:03.187862 master-0 kubenswrapper[15202]: I0319 09:48:03.187749 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:48:03.191937 master-0 kubenswrapper[15202]: I0319 09:48:03.191877 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c7c160b-03f5-4120-9169-6c15f43bc781-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-8hx4g\" (UID: \"5c7c160b-03f5-4120-9169-6c15f43bc781\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:48:03.242253 master-0 kubenswrapper[15202]: I0319 09:48:03.242153 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:48:04.015651 master-0 kubenswrapper[15202]: I0319 09:48:04.015568 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7"] Mar 19 09:48:04.030450 master-0 kubenswrapper[15202]: I0319 09:48:04.030361 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g"] Mar 19 09:48:04.675768 master-0 kubenswrapper[15202]: I0319 09:48:04.675692 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" event={"ID":"aeb8435e-9818-471c-b779-9e40d7084842","Type":"ContainerStarted","Data":"ac95599c039bc0ce907e533a3c7a58997c1851219a1fc6263a3c55b889eedf40"} Mar 19 09:48:04.677988 master-0 kubenswrapper[15202]: I0319 09:48:04.677951 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" event={"ID":"5c7c160b-03f5-4120-9169-6c15f43bc781","Type":"ContainerStarted","Data":"d0cc0b8680484c24e60135bf2a449cc1c3089e3ed4936ef7dc9bbc314e8bb476"} Mar 19 09:48:04.677988 master-0 kubenswrapper[15202]: I0319 09:48:04.677980 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" event={"ID":"5c7c160b-03f5-4120-9169-6c15f43bc781","Type":"ContainerStarted","Data":"a6ff881485c6abcdcc723db42b08be9e2c7634bfb0d5fba02ae233249d987a24"} Mar 19 09:48:04.678940 master-0 kubenswrapper[15202]: I0319 09:48:04.678910 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:48:04.731277 master-0 kubenswrapper[15202]: I0319 09:48:04.731170 15202 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" podStartSLOduration=34.731142541 podStartE2EDuration="34.731142541s" podCreationTimestamp="2026-03-19 09:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:04.710695378 +0000 UTC m=+1402.096110204" watchObservedRunningTime="2026-03-19 09:48:04.731142541 +0000 UTC m=+1402.116557387" Mar 19 09:48:06.069574 master-0 kubenswrapper[15202]: I0319 09:48:06.069507 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-xmlj9" Mar 19 09:48:06.713484 master-0 kubenswrapper[15202]: I0319 09:48:06.713387 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" event={"ID":"aeb8435e-9818-471c-b779-9e40d7084842","Type":"ContainerStarted","Data":"b1feb64f1aaa671ecc19d61dbee73d877038e53361ddc5d33497b2976f340f6b"} Mar 19 09:48:06.759554 master-0 kubenswrapper[15202]: I0319 09:48:06.759447 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" podStartSLOduration=35.739425444 podStartE2EDuration="37.759420894s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="2026-03-19 09:48:04.01756116 +0000 UTC m=+1401.402975976" lastFinishedPulling="2026-03-19 09:48:06.03755661 +0000 UTC m=+1403.422971426" observedRunningTime="2026-03-19 09:48:06.746512176 +0000 UTC m=+1404.131927032" watchObservedRunningTime="2026-03-19 09:48:06.759420894 +0000 UTC m=+1404.144835710" Mar 19 09:48:07.728225 master-0 kubenswrapper[15202]: I0319 09:48:07.728141 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:48:13.117553 master-0 kubenswrapper[15202]: I0319 09:48:13.117457 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899dzhg7" Mar 19 09:48:13.283497 master-0 kubenswrapper[15202]: I0319 09:48:13.282515 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-8hx4g" Mar 19 09:48:55.552587 master-0 kubenswrapper[15202]: I0319 09:48:55.550513 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cdfrk"] Mar 19 09:48:55.552587 master-0 kubenswrapper[15202]: I0319 09:48:55.552310 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk" Mar 19 09:48:55.555498 master-0 kubenswrapper[15202]: I0319 09:48:55.554678 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 09:48:55.555498 master-0 kubenswrapper[15202]: I0319 09:48:55.554846 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 09:48:55.555498 master-0 kubenswrapper[15202]: I0319 09:48:55.555023 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 09:48:55.565004 master-0 kubenswrapper[15202]: I0319 09:48:55.563754 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cdfrk"] Mar 19 09:48:55.591101 master-0 kubenswrapper[15202]: I0319 09:48:55.591048 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-6bm4q"] Mar 19 09:48:55.592678 master-0 kubenswrapper[15202]: I0319 09:48:55.592624 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.594772 master-0 kubenswrapper[15202]: I0319 09:48:55.594509 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 19 09:48:55.606259 master-0 kubenswrapper[15202]: I0319 09:48:55.606163 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-6bm4q"]
Mar 19 09:48:55.610709 master-0 kubenswrapper[15202]: I0319 09:48:55.610671 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd576b4-a565-472e-a6a4-7c14e86f9458-config\") pod \"dnsmasq-dns-685c76cf85-cdfrk\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.610827 master-0 kubenswrapper[15202]: I0319 09:48:55.610723 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpw7x\" (UniqueName: \"kubernetes.io/projected/ddd576b4-a565-472e-a6a4-7c14e86f9458-kube-api-access-kpw7x\") pod \"dnsmasq-dns-685c76cf85-cdfrk\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.610827 master-0 kubenswrapper[15202]: I0319 09:48:55.610767 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-config\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.610827 master-0 kubenswrapper[15202]: I0319 09:48:55.610788 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjxbh\" (UniqueName: \"kubernetes.io/projected/dd3e6a54-c98e-4598-972d-2d1ab10797db-kube-api-access-zjxbh\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.610967 master-0 kubenswrapper[15202]: I0319 09:48:55.610943 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.712616 master-0 kubenswrapper[15202]: I0319 09:48:55.712522 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpw7x\" (UniqueName: \"kubernetes.io/projected/ddd576b4-a565-472e-a6a4-7c14e86f9458-kube-api-access-kpw7x\") pod \"dnsmasq-dns-685c76cf85-cdfrk\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.712874 master-0 kubenswrapper[15202]: I0319 09:48:55.712652 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-config\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.712874 master-0 kubenswrapper[15202]: I0319 09:48:55.712719 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxbh\" (UniqueName: \"kubernetes.io/projected/dd3e6a54-c98e-4598-972d-2d1ab10797db-kube-api-access-zjxbh\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.715463 master-0 kubenswrapper[15202]: I0319 09:48:55.715416 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-config\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.716180 master-0 kubenswrapper[15202]: I0319 09:48:55.716087 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.716379 master-0 kubenswrapper[15202]: I0319 09:48:55.716234 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd576b4-a565-472e-a6a4-7c14e86f9458-config\") pod \"dnsmasq-dns-685c76cf85-cdfrk\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.717426 master-0 kubenswrapper[15202]: I0319 09:48:55.717388 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.718016 master-0 kubenswrapper[15202]: I0319 09:48:55.717985 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd576b4-a565-472e-a6a4-7c14e86f9458-config\") pod \"dnsmasq-dns-685c76cf85-cdfrk\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.730760 master-0 kubenswrapper[15202]: I0319 09:48:55.730705 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxbh\" (UniqueName: \"kubernetes.io/projected/dd3e6a54-c98e-4598-972d-2d1ab10797db-kube-api-access-zjxbh\") pod \"dnsmasq-dns-8476fd89bc-6bm4q\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:55.732537 master-0 kubenswrapper[15202]: I0319 09:48:55.732452 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpw7x\" (UniqueName: \"kubernetes.io/projected/ddd576b4-a565-472e-a6a4-7c14e86f9458-kube-api-access-kpw7x\") pod \"dnsmasq-dns-685c76cf85-cdfrk\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.920276 master-0 kubenswrapper[15202]: I0319 09:48:55.920134 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk"
Mar 19 09:48:55.953986 master-0 kubenswrapper[15202]: I0319 09:48:55.953926 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q"
Mar 19 09:48:56.261499 master-0 kubenswrapper[15202]: I0319 09:48:56.257520 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cdfrk"]
Mar 19 09:48:56.315491 master-0 kubenswrapper[15202]: I0319 09:48:56.312592 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76849d6659-8tphm"]
Mar 19 09:48:56.315491 master-0 kubenswrapper[15202]: I0319 09:48:56.314103 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.329548 master-0 kubenswrapper[15202]: I0319 09:48:56.328530 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-dns-svc\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.329548 master-0 kubenswrapper[15202]: I0319 09:48:56.328610 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-config\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.329548 master-0 kubenswrapper[15202]: I0319 09:48:56.328633 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rb82\" (UniqueName: \"kubernetes.io/projected/784cfe54-5ee5-4c81-a106-d785d5803e58-kube-api-access-9rb82\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.342781 master-0 kubenswrapper[15202]: I0319 09:48:56.342738 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-8tphm"]
Mar 19 09:48:56.438127 master-0 kubenswrapper[15202]: I0319 09:48:56.435148 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-dns-svc\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.438127 master-0 kubenswrapper[15202]: I0319 09:48:56.435278 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-config\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.438127 master-0 kubenswrapper[15202]: I0319 09:48:56.435303 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rb82\" (UniqueName: \"kubernetes.io/projected/784cfe54-5ee5-4c81-a106-d785d5803e58-kube-api-access-9rb82\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.441871 master-0 kubenswrapper[15202]: I0319 09:48:56.440775 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-dns-svc\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.447855 master-0 kubenswrapper[15202]: I0319 09:48:56.447751 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-config\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.457637 master-0 kubenswrapper[15202]: I0319 09:48:56.457435 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rb82\" (UniqueName: \"kubernetes.io/projected/784cfe54-5ee5-4c81-a106-d785d5803e58-kube-api-access-9rb82\") pod \"dnsmasq-dns-76849d6659-8tphm\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.528396 master-0 kubenswrapper[15202]: I0319 09:48:56.527714 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cdfrk"]
Mar 19 09:48:56.534265 master-0 kubenswrapper[15202]: W0319 09:48:56.534210 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd576b4_a565_472e_a6a4_7c14e86f9458.slice/crio-d0779330e323b142507d22f2b4cd2519b8531c9c5272086f2cc2819a73cbb1af WatchSource:0}: Error finding container d0779330e323b142507d22f2b4cd2519b8531c9c5272086f2cc2819a73cbb1af: Status 404 returned error can't find the container with id d0779330e323b142507d22f2b4cd2519b8531c9c5272086f2cc2819a73cbb1af
Mar 19 09:48:56.647381 master-0 kubenswrapper[15202]: I0319 09:48:56.642025 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-8tphm"
Mar 19 09:48:56.682079 master-0 kubenswrapper[15202]: I0319 09:48:56.682011 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-6bm4q"]
Mar 19 09:48:56.844493 master-0 kubenswrapper[15202]: I0319 09:48:56.843341 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q" event={"ID":"dd3e6a54-c98e-4598-972d-2d1ab10797db","Type":"ContainerStarted","Data":"bc909673f398054c9b8d92bce53a62f4b4ce1e712c5a891f2d7b870c000b836d"}
Mar 19 09:48:56.848918 master-0 kubenswrapper[15202]: I0319 09:48:56.847057 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk" event={"ID":"ddd576b4-a565-472e-a6a4-7c14e86f9458","Type":"ContainerStarted","Data":"d0779330e323b142507d22f2b4cd2519b8531c9c5272086f2cc2819a73cbb1af"}
Mar 19 09:48:56.882220 master-0 kubenswrapper[15202]: I0319 09:48:56.882140 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-6bm4q"]
Mar 19 09:48:56.935810 master-0 kubenswrapper[15202]: I0319 09:48:56.934738 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"]
Mar 19 09:48:56.936432 master-0 kubenswrapper[15202]: I0319 09:48:56.936355 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:56.948726 master-0 kubenswrapper[15202]: I0319 09:48:56.948663 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"]
Mar 19 09:48:56.949373 master-0 kubenswrapper[15202]: I0319 09:48:56.949332 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-config\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:56.949453 master-0 kubenswrapper[15202]: I0319 09:48:56.949426 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmscr\" (UniqueName: \"kubernetes.io/projected/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-kube-api-access-lmscr\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:56.949521 master-0 kubenswrapper[15202]: I0319 09:48:56.949491 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.054116 master-0 kubenswrapper[15202]: I0319 09:48:57.053763 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmscr\" (UniqueName: \"kubernetes.io/projected/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-kube-api-access-lmscr\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.054116 master-0 kubenswrapper[15202]: I0319 09:48:57.053842 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.054116 master-0 kubenswrapper[15202]: I0319 09:48:57.053958 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-config\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.055087 master-0 kubenswrapper[15202]: I0319 09:48:57.055060 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-config\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.056054 master-0 kubenswrapper[15202]: I0319 09:48:57.056019 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.076328 master-0 kubenswrapper[15202]: I0319 09:48:57.076279 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmscr\" (UniqueName: \"kubernetes.io/projected/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-kube-api-access-lmscr\") pod \"dnsmasq-dns-6ff8fd9d5c-qk9z4\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.219431 master-0 kubenswrapper[15202]: I0319 09:48:57.219392 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-8tphm"]
Mar 19 09:48:57.265284 master-0 kubenswrapper[15202]: I0319 09:48:57.265218 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"
Mar 19 09:48:57.869602 master-0 kubenswrapper[15202]: I0319 09:48:57.869547 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-8tphm" event={"ID":"784cfe54-5ee5-4c81-a106-d785d5803e58","Type":"ContainerStarted","Data":"4a8d780311e93fc4f04922e55d522df7cb06ca95982e7ab7f4f576b777d87741"}
Mar 19 09:48:57.935604 master-0 kubenswrapper[15202]: I0319 09:48:57.935517 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"]
Mar 19 09:48:58.947145 master-0 kubenswrapper[15202]: I0319 09:48:58.947078 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" event={"ID":"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7","Type":"ContainerStarted","Data":"251d4d1ac0805629bd519260b46532693cf3c202cc96f928e2dbb3873446eee7"}
Mar 19 09:49:00.504915 master-0 kubenswrapper[15202]: I0319 09:49:00.504321 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 09:49:00.506662 master-0 kubenswrapper[15202]: I0319 09:49:00.506623 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.509387 master-0 kubenswrapper[15202]: I0319 09:49:00.509335 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 19 09:49:00.509887 master-0 kubenswrapper[15202]: I0319 09:49:00.509867 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 19 09:49:00.509996 master-0 kubenswrapper[15202]: I0319 09:49:00.509987 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 19 09:49:00.513226 master-0 kubenswrapper[15202]: I0319 09:49:00.513161 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 19 09:49:00.513575 master-0 kubenswrapper[15202]: I0319 09:49:00.513536 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 19 09:49:00.513666 master-0 kubenswrapper[15202]: I0319 09:49:00.513590 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 19 09:49:00.583115 master-0 kubenswrapper[15202]: I0319 09:49:00.583009 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 09:49:00.635354 master-0 kubenswrapper[15202]: I0319 09:49:00.635208 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.635354 master-0 kubenswrapper[15202]: I0319 09:49:00.635298 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-958cfe64-d1d3-4ec7-a3d8-81cbd46a10b2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^59a1f797-a1db-4e8c-806d-60dedc1586cc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.635695 master-0 kubenswrapper[15202]: I0319 09:49:00.635638 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wb6\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-kube-api-access-w4wb6\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.635695 master-0 kubenswrapper[15202]: I0319 09:49:00.635678 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.635784 master-0 kubenswrapper[15202]: I0319 09:49:00.635694 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.635784 master-0 kubenswrapper[15202]: I0319 09:49:00.635723 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.636039 master-0 kubenswrapper[15202]: I0319 09:49:00.635995 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.636223 master-0 kubenswrapper[15202]: I0319 09:49:00.636184 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.636288 master-0 kubenswrapper[15202]: I0319 09:49:00.636235 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.636417 master-0 kubenswrapper[15202]: I0319 09:49:00.636388 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.636513 master-0 kubenswrapper[15202]: I0319 09:49:00.636420 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.762275 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.762599 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.762664 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.762907 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.762941 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.763382 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.763653 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-958cfe64-d1d3-4ec7-a3d8-81cbd46a10b2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^59a1f797-a1db-4e8c-806d-60dedc1586cc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.763695 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4wb6\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-kube-api-access-w4wb6\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.763738 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.763760 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.764065 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.768802 master-0 kubenswrapper[15202]: I0319 09:49:00.766273 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.771565 master-0 kubenswrapper[15202]: I0319 09:49:00.770624 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.774382 master-0 kubenswrapper[15202]: I0319 09:49:00.772215 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.774382 master-0 kubenswrapper[15202]: I0319 09:49:00.773015 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.780813 master-0 kubenswrapper[15202]: I0319 09:49:00.776294 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.780813 master-0 kubenswrapper[15202]: I0319 09:49:00.779402 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.804794 master-0 kubenswrapper[15202]: I0319 09:49:00.786912 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:49:00.804794 master-0 kubenswrapper[15202]: I0319 09:49:00.786973 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-958cfe64-d1d3-4ec7-a3d8-81cbd46a10b2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^59a1f797-a1db-4e8c-806d-60dedc1586cc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/1969a7c4ec9215056c50bd84af20fd8a869a5ff9f240ec4f35b2c3a8bf98c2cf/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.804794 master-0 kubenswrapper[15202]: I0319 09:49:00.790120 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.804794 master-0 kubenswrapper[15202]: I0319 09:49:00.801387 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.815373 master-0 kubenswrapper[15202]: I0319 09:49:00.811104 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:00.835588 master-0 kubenswrapper[15202]: I0319 09:49:00.829039 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wb6\" (UniqueName: \"kubernetes.io/projected/9bab9d65-06f1-4b08-aa8c-5f12e7d06183-kube-api-access-w4wb6\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:01.099305 master-0 kubenswrapper[15202]: I0319 09:49:01.098995 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 09:49:01.101065 master-0 kubenswrapper[15202]: I0319 09:49:01.100947 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.109819 master-0 kubenswrapper[15202]: I0319 09:49:01.106530 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 19 09:49:01.112744 master-0 kubenswrapper[15202]: I0319 09:49:01.112449 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 19 09:49:01.112744 master-0 kubenswrapper[15202]: I0319 09:49:01.112561 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 19 09:49:01.112949 master-0 kubenswrapper[15202]: I0319 09:49:01.112908 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 19 09:49:01.112999 master-0 kubenswrapper[15202]: I0319 09:49:01.112979 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 19 09:49:01.112999 master-0 kubenswrapper[15202]: I0319 09:49:01.112989 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195203 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6j5\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-kube-api-access-vq6j5\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195371 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195436 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195530 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3fd58476-e6c9-4799-b98f-2b7147237a93\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ccef839e-d1c4-49fc-92f0-33a894a197b1\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195693 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195778 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67fbe9e8-1121-4091-954c-c6a620d98528-pod-info\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.202675 master-0 kubenswrapper[15202]: I0319 09:49:01.195815 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-config-data\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.208216 master-0 kubenswrapper[15202]: I0319 09:49:01.195884 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-server-conf\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.208216 master-0 kubenswrapper[15202]: I0319 09:49:01.204405 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67fbe9e8-1121-4091-954c-c6a620d98528-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.208216 master-0 kubenswrapper[15202]: I0319 09:49:01.204563 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.208216 master-0 kubenswrapper[15202]: I0319 09:49:01.204586 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.208216 master-0 kubenswrapper[15202]: I0319 09:49:01.208058 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 09:49:01.229024 master-0 kubenswrapper[15202]: I0319 09:49:01.228796 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 19 09:49:01.243259 master-0 kubenswrapper[15202]: I0319 09:49:01.241445 15202
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 09:49:01.243259 master-0 kubenswrapper[15202]: I0319 09:49:01.242490 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 09:49:01.246069 master-0 kubenswrapper[15202]: I0319 09:49:01.246045 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 09:49:01.251175 master-0 kubenswrapper[15202]: I0319 09:49:01.251143 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 09:49:01.258652 master-0 kubenswrapper[15202]: I0319 09:49:01.258596 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 09:49:01.308605 master-0 kubenswrapper[15202]: I0319 09:49:01.308493 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-kolla-config\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0" Mar 19 09:49:01.308605 master-0 kubenswrapper[15202]: I0319 09:49:01.308570 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-config-data\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308605 master-0 kubenswrapper[15202]: I0319 09:49:01.308621 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-server-conf\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 
09:49:01.308657 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gzb\" (UniqueName: \"kubernetes.io/projected/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-kube-api-access-55gzb\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308693 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67fbe9e8-1121-4091-954c-c6a620d98528-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308741 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308763 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308799 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308835 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vq6j5\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-kube-api-access-vq6j5\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308860 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308893 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.308926 master-0 kubenswrapper[15202]: I0319 09:49:01.308920 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0" Mar 19 09:49:01.309309 master-0 kubenswrapper[15202]: I0319 09:49:01.308952 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3fd58476-e6c9-4799-b98f-2b7147237a93\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ccef839e-d1c4-49fc-92f0-33a894a197b1\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.309309 master-0 kubenswrapper[15202]: I0319 09:49:01.308979 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-config-data\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0" Mar 19 09:49:01.309309 master-0 kubenswrapper[15202]: I0319 09:49:01.309014 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.309309 master-0 kubenswrapper[15202]: I0319 09:49:01.309036 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67fbe9e8-1121-4091-954c-c6a620d98528-pod-info\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.317800 master-0 kubenswrapper[15202]: I0319 09:49:01.316781 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.318865 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-server-conf\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.319524 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/67fbe9e8-1121-4091-954c-c6a620d98528-pod-info\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.320441 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67fbe9e8-1121-4091-954c-c6a620d98528-config-data\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.321087 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.321605 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.324215 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0" Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.324755 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.324780 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3fd58476-e6c9-4799-b98f-2b7147237a93\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ccef839e-d1c4-49fc-92f0-33a894a197b1\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/48e9fc3eaa0947ed00cbc39a8ec556da81601644c0f562ace217adf672da55f9/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.329363 master-0 kubenswrapper[15202]: I0319 09:49:01.327873 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67fbe9e8-1121-4091-954c-c6a620d98528-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.339683 master-0 kubenswrapper[15202]: I0319 09:49:01.339220 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.354146 master-0 kubenswrapper[15202]: I0319 09:49:01.353901 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6j5\" (UniqueName: \"kubernetes.io/projected/67fbe9e8-1121-4091-954c-c6a620d98528-kube-api-access-vq6j5\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:01.410785 master-0 kubenswrapper[15202]: I0319 09:49:01.410722 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.411030 master-0 kubenswrapper[15202]: I0319 09:49:01.410833 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.411030 master-0 kubenswrapper[15202]: I0319 09:49:01.410883 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-config-data\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.411764 master-0 kubenswrapper[15202]: I0319 09:49:01.411698 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-kolla-config\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.412143 master-0 kubenswrapper[15202]: I0319 09:49:01.412116 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55gzb\" (UniqueName: \"kubernetes.io/projected/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-kube-api-access-55gzb\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.412835 master-0 kubenswrapper[15202]: I0319 09:49:01.412794 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-kolla-config\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.413114 master-0 kubenswrapper[15202]: I0319 09:49:01.413076 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-config-data\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.418969 master-0 kubenswrapper[15202]: I0319 09:49:01.418929 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.421731 master-0 kubenswrapper[15202]: I0319 09:49:01.420337 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.447412 master-0 kubenswrapper[15202]: I0319 09:49:01.447343 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gzb\" (UniqueName: \"kubernetes.io/projected/d9d44135-dd46-4cc3-aa4f-21c5b9d1604c-kube-api-access-55gzb\") pod \"memcached-0\" (UID: \"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c\") " pod="openstack/memcached-0"
Mar 19 09:49:01.602677 master-0 kubenswrapper[15202]: I0319 09:49:01.602562 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 19 09:49:02.342575 master-0 kubenswrapper[15202]: I0319 09:49:02.336578 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 19 09:49:02.551925 master-0 kubenswrapper[15202]: I0319 09:49:02.528199 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-958cfe64-d1d3-4ec7-a3d8-81cbd46a10b2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^59a1f797-a1db-4e8c-806d-60dedc1586cc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9bab9d65-06f1-4b08-aa8c-5f12e7d06183\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:02.605008 master-0 kubenswrapper[15202]: I0319 09:49:02.603240 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 09:49:02.607585 master-0 kubenswrapper[15202]: I0319 09:49:02.605148 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 19 09:49:02.614003 master-0 kubenswrapper[15202]: I0319 09:49:02.611583 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 19 09:49:02.614003 master-0 kubenswrapper[15202]: I0319 09:49:02.611846 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 19 09:49:02.614003 master-0 kubenswrapper[15202]: I0319 09:49:02.612153 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 19 09:49:02.695513 master-0 kubenswrapper[15202]: I0319 09:49:02.695269 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:02.695871 master-0 kubenswrapper[15202]: I0319 09:49:02.695569 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770095 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770183 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770239 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770257 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770278 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770301 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770344 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-634204c0-c830-46d5-93f3-06cce770b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^02232a54-54e3-4e22-a6ea-790f3b963eb1\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.776500 master-0 kubenswrapper[15202]: I0319 09:49:02.770381 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkcx\" (UniqueName: \"kubernetes.io/projected/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-kube-api-access-nqkcx\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877112 master-0 kubenswrapper[15202]: I0319 09:49:02.876942 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-634204c0-c830-46d5-93f3-06cce770b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^02232a54-54e3-4e22-a6ea-790f3b963eb1\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877112 master-0 kubenswrapper[15202]: I0319 09:49:02.877075 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkcx\" (UniqueName: \"kubernetes.io/projected/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-kube-api-access-nqkcx\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877399 master-0 kubenswrapper[15202]: I0319 09:49:02.877144 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877399 master-0 kubenswrapper[15202]: I0319 09:49:02.877182 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877399 master-0 kubenswrapper[15202]: I0319 09:49:02.877244 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877399 master-0 kubenswrapper[15202]: I0319 09:49:02.877266 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877399 master-0 kubenswrapper[15202]: I0319 09:49:02.877289 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.877399 master-0 kubenswrapper[15202]: I0319 09:49:02.877319 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.886794 master-0 kubenswrapper[15202]: I0319 09:49:02.886731 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.888670 master-0 kubenswrapper[15202]: I0319 09:49:02.888593 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.891634 master-0 kubenswrapper[15202]: I0319 09:49:02.891602 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:49:02.891726 master-0 kubenswrapper[15202]: I0319 09:49:02.891648 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-634204c0-c830-46d5-93f3-06cce770b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^02232a54-54e3-4e22-a6ea-790f3b963eb1\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0f797daba00e4247ca28ba82dfb01876edef1e43920377ffb22a55f720109e67/globalmount\"" pod="openstack/openstack-galera-0"
Mar 19 09:49:02.892330 master-0 kubenswrapper[15202]: I0319 09:49:02.892291 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.893543 master-0 kubenswrapper[15202]: I0319 09:49:02.893378 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.926808 master-0 kubenswrapper[15202]: I0319 09:49:02.920237 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.949113 master-0 kubenswrapper[15202]: I0319 09:49:02.948947 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkcx\" (UniqueName: \"kubernetes.io/projected/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-kube-api-access-nqkcx\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:02.979542 master-0 kubenswrapper[15202]: I0319 09:49:02.976969 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534e2f72-f4ac-40f2-8dad-a1100e7c67b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0"
Mar 19 09:49:03.149836 master-0 kubenswrapper[15202]: I0319 09:49:03.144999 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c","Type":"ContainerStarted","Data":"bbdf8fbfd911d448d16b6cf225cad614b7ed7fa96e258614483ce41f881d2e36"}
Mar 19 09:49:03.968299 master-0 kubenswrapper[15202]: I0319 09:49:03.967451 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3fd58476-e6c9-4799-b98f-2b7147237a93\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ccef839e-d1c4-49fc-92f0-33a894a197b1\") pod \"rabbitmq-server-0\" (UID: \"67fbe9e8-1121-4091-954c-c6a620d98528\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:49:03.978831 master-0 kubenswrapper[15202]: I0319 09:49:03.978723 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 19 09:49:03.982426 master-0 kubenswrapper[15202]: I0319 09:49:03.982395 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:03.987053 master-0 kubenswrapper[15202]: I0319 09:49:03.984867 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 19 09:49:03.987053 master-0 kubenswrapper[15202]: I0319 09:49:03.985095 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 19 09:49:03.987053 master-0 kubenswrapper[15202]: I0319 09:49:03.985428 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 19 09:49:03.995873 master-0 kubenswrapper[15202]: I0319 09:49:03.992304 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 19 09:49:04.015557 master-0 kubenswrapper[15202]: I0319 09:49:04.006708 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.062992 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063067 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlq79\" (UniqueName: \"kubernetes.io/projected/a491330a-0016-4f3a-b003-bb80733aaaab-kube-api-access-mlq79\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063133 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a491330a-0016-4f3a-b003-bb80733aaaab-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063175 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e5a2d74-11f9-48e0-80c2-b9f406c2965e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ce3de23c-849f-418c-8f29-55ac4e165b8b\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063246 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a491330a-0016-4f3a-b003-bb80733aaaab-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063342 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a491330a-0016-4f3a-b003-bb80733aaaab-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063387 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.063972 master-0 kubenswrapper[15202]: I0319 09:49:04.063441 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.171870 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.172261 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.172435 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlq79\" (UniqueName: \"kubernetes.io/projected/a491330a-0016-4f3a-b003-bb80733aaaab-kube-api-access-mlq79\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.174966 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:49:04.197665 master-0
kubenswrapper[15202]: I0319 09:49:04.175223 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a491330a-0016-4f3a-b003-bb80733aaaab-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.176760 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a491330a-0016-4f3a-b003-bb80733aaaab-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.178281 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e5a2d74-11f9-48e0-80c2-b9f406c2965e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ce3de23c-849f-418c-8f29-55ac4e165b8b\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.178436 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a491330a-0016-4f3a-b003-bb80733aaaab-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.178635 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a491330a-0016-4f3a-b003-bb80733aaaab-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 
09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.178706 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.179118 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.183268 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a491330a-0016-4f3a-b003-bb80733aaaab-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.185178 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:49:04.197665 master-0 kubenswrapper[15202]: I0319 09:49:04.185212 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e5a2d74-11f9-48e0-80c2-b9f406c2965e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ce3de23c-849f-418c-8f29-55ac4e165b8b\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/ec5cc27e63ba990b7dcb2da7180a0c264eb360cb2f78122318160aba8c8f10a3/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.214229 master-0 kubenswrapper[15202]: I0319 09:49:04.209066 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlq79\" (UniqueName: \"kubernetes.io/projected/a491330a-0016-4f3a-b003-bb80733aaaab-kube-api-access-mlq79\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.214229 master-0 kubenswrapper[15202]: I0319 09:49:04.211199 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 09:49:04.222968 master-0 kubenswrapper[15202]: I0319 09:49:04.214802 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a491330a-0016-4f3a-b003-bb80733aaaab-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.222968 master-0 kubenswrapper[15202]: I0319 09:49:04.215971 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a491330a-0016-4f3a-b003-bb80733aaaab-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:04.889583 master-0 kubenswrapper[15202]: I0319 09:49:04.888291 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-634204c0-c830-46d5-93f3-06cce770b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^02232a54-54e3-4e22-a6ea-790f3b963eb1\") pod \"openstack-galera-0\" (UID: \"534e2f72-f4ac-40f2-8dad-a1100e7c67b1\") " pod="openstack/openstack-galera-0" Mar 19 09:49:05.032440 master-0 kubenswrapper[15202]: I0319 09:49:05.032356 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 09:49:05.915232 master-0 kubenswrapper[15202]: I0319 09:49:05.915147 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e5a2d74-11f9-48e0-80c2-b9f406c2965e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ce3de23c-849f-418c-8f29-55ac4e165b8b\") pod \"openstack-cell1-galera-0\" (UID: \"a491330a-0016-4f3a-b003-bb80733aaaab\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:06.435200 master-0 kubenswrapper[15202]: I0319 09:49:06.433220 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:07.324302 master-0 kubenswrapper[15202]: I0319 09:49:07.324236 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 09:49:07.328203 master-0 kubenswrapper[15202]: I0319 09:49:07.328159 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.331270 master-0 kubenswrapper[15202]: I0319 09:49:07.331215 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 09:49:07.332568 master-0 kubenswrapper[15202]: I0319 09:49:07.331593 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 09:49:07.332568 master-0 kubenswrapper[15202]: I0319 09:49:07.331787 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 09:49:07.332568 master-0 kubenswrapper[15202]: I0319 09:49:07.332031 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 09:49:07.370604 master-0 kubenswrapper[15202]: I0319 09:49:07.368557 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 09:49:07.416400 master-0 kubenswrapper[15202]: I0319 09:49:07.416307 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.416400 master-0 kubenswrapper[15202]: I0319 09:49:07.416403 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.416898 master-0 kubenswrapper[15202]: I0319 09:49:07.416553 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.416898 master-0 kubenswrapper[15202]: I0319 09:49:07.416645 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.416898 master-0 kubenswrapper[15202]: I0319 09:49:07.416757 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efa4a51f-71e2-4b74-be2f-ade92b38c81c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fefdb13b-b69a-4161-9166-699301ff9a91\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.416898 master-0 kubenswrapper[15202]: I0319 09:49:07.416812 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-config\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.417137 master-0 kubenswrapper[15202]: I0319 09:49:07.416996 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsgh\" (UniqueName: 
\"kubernetes.io/projected/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-kube-api-access-dzsgh\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.417137 master-0 kubenswrapper[15202]: I0319 09:49:07.417089 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.466762 master-0 kubenswrapper[15202]: I0319 09:49:07.466612 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m68fw"] Mar 19 09:49:07.468689 master-0 kubenswrapper[15202]: I0319 09:49:07.468631 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.484476 master-0 kubenswrapper[15202]: I0319 09:49:07.482768 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-sl66q"] Mar 19 09:49:07.484679 master-0 kubenswrapper[15202]: I0319 09:49:07.484648 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 19 09:49:07.487528 master-0 kubenswrapper[15202]: I0319 09:49:07.484872 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 19 09:49:07.487528 master-0 kubenswrapper[15202]: I0319 09:49:07.486672 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.508726 master-0 kubenswrapper[15202]: I0319 09:49:07.508029 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m68fw"] Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518370 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518454 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518499 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-scripts\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518538 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518589 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518613 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgpd2\" (UniqueName: \"kubernetes.io/projected/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-kube-api-access-pgpd2\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518644 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-log-ovn\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518686 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efa4a51f-71e2-4b74-be2f-ade92b38c81c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fefdb13b-b69a-4161-9166-699301ff9a91\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518704 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-run\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518727 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-ovn-controller-tls-certs\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518745 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-combined-ca-bundle\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518766 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-config\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518783 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsgh\" (UniqueName: \"kubernetes.io/projected/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-kube-api-access-dzsgh\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518805 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-run-ovn\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.519232 master-0 kubenswrapper[15202]: I0319 09:49:07.518820 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.523971 master-0 kubenswrapper[15202]: I0319 09:49:07.520268 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-config\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.523971 master-0 kubenswrapper[15202]: I0319 09:49:07.521154 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.523971 master-0 kubenswrapper[15202]: I0319 09:49:07.522227 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.524115 master-0 kubenswrapper[15202]: I0319 09:49:07.524098 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:49:07.524162 master-0 kubenswrapper[15202]: I0319 09:49:07.524133 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efa4a51f-71e2-4b74-be2f-ade92b38c81c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fefdb13b-b69a-4161-9166-699301ff9a91\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/27f9c8762726cdb3117cdac3264e1c9340b7138677ad7d8906e3579f10989d95/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.530137 master-0 kubenswrapper[15202]: I0319 09:49:07.530076 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sl66q"] Mar 19 09:49:07.546256 master-0 kubenswrapper[15202]: I0319 09:49:07.546218 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.560558 master-0 kubenswrapper[15202]: I0319 09:49:07.554631 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzsgh\" (UniqueName: \"kubernetes.io/projected/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-kube-api-access-dzsgh\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.560558 master-0 kubenswrapper[15202]: I0319 09:49:07.557829 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.560558 master-0 kubenswrapper[15202]: I0319 09:49:07.558562 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/756b7d24-4b2a-48d2-b574-c0a2f3f9a411-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:07.620846 master-0 kubenswrapper[15202]: I0319 09:49:07.620787 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-log-ovn\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621250 master-0 kubenswrapper[15202]: I0319 09:49:07.621064 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-run\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621250 master-0 kubenswrapper[15202]: I0319 09:49:07.621132 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-etc-ovs\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.621250 master-0 kubenswrapper[15202]: I0319 09:49:07.621208 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-ovn-controller-tls-certs\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621434 master-0 kubenswrapper[15202]: I0319 09:49:07.621265 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-combined-ca-bundle\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621434 master-0 kubenswrapper[15202]: I0319 09:49:07.621367 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-run-ovn\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621434 master-0 kubenswrapper[15202]: I0319 09:49:07.621373 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-log-ovn\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621434 master-0 kubenswrapper[15202]: I0319 09:49:07.621424 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-run\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621601 master-0 kubenswrapper[15202]: I0319 09:49:07.621490 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d4e5d5b-673c-4292-8b11-b58920594cf5-scripts\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.621601 master-0 kubenswrapper[15202]: I0319 09:49:07.621532 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znb9f\" (UniqueName: 
\"kubernetes.io/projected/5d4e5d5b-673c-4292-8b11-b58920594cf5-kube-api-access-znb9f\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.621672 master-0 kubenswrapper[15202]: I0319 09:49:07.621641 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-log\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.621756 master-0 kubenswrapper[15202]: I0319 09:49:07.621674 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-scripts\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.621756 master-0 kubenswrapper[15202]: I0319 09:49:07.621703 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-run\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.621941 master-0 kubenswrapper[15202]: I0319 09:49:07.621906 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgpd2\" (UniqueName: \"kubernetes.io/projected/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-kube-api-access-pgpd2\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.622065 master-0 kubenswrapper[15202]: I0319 09:49:07.622043 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-lib\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.622281 master-0 kubenswrapper[15202]: I0319 09:49:07.622244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-var-run-ovn\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.626531 master-0 kubenswrapper[15202]: I0319 09:49:07.624565 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-scripts\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.626531 master-0 kubenswrapper[15202]: I0319 09:49:07.624744 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-ovn-controller-tls-certs\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.626531 master-0 kubenswrapper[15202]: I0319 09:49:07.625382 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-combined-ca-bundle\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.724281 master-0 kubenswrapper[15202]: I0319 09:49:07.724196 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-lib\") pod 
\"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.724325 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-etc-ovs\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.724502 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d4e5d5b-673c-4292-8b11-b58920594cf5-scripts\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.724543 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-lib\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.724557 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znb9f\" (UniqueName: \"kubernetes.io/projected/5d4e5d5b-673c-4292-8b11-b58920594cf5-kube-api-access-znb9f\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.724845 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-etc-ovs\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " 
pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.724958 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-log\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.725038 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-run\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.725751 master-0 kubenswrapper[15202]: I0319 09:49:07.725087 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-log\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.731905 master-0 kubenswrapper[15202]: I0319 09:49:07.726700 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d4e5d5b-673c-4292-8b11-b58920594cf5-var-run\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.731905 master-0 kubenswrapper[15202]: I0319 09:49:07.727535 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d4e5d5b-673c-4292-8b11-b58920594cf5-scripts\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.824206 master-0 kubenswrapper[15202]: I0319 09:49:07.824142 
15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znb9f\" (UniqueName: \"kubernetes.io/projected/5d4e5d5b-673c-4292-8b11-b58920594cf5-kube-api-access-znb9f\") pod \"ovn-controller-ovs-sl66q\" (UID: \"5d4e5d5b-673c-4292-8b11-b58920594cf5\") " pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:07.833149 master-0 kubenswrapper[15202]: I0319 09:49:07.833086 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgpd2\" (UniqueName: \"kubernetes.io/projected/2ed2b7a9-27a6-43ac-ba84-ae1a7d670160-kube-api-access-pgpd2\") pod \"ovn-controller-m68fw\" (UID: \"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160\") " pod="openstack/ovn-controller-m68fw" Mar 19 09:49:07.906508 master-0 kubenswrapper[15202]: I0319 09:49:07.906304 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:08.118398 master-0 kubenswrapper[15202]: I0319 09:49:08.118294 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m68fw" Mar 19 09:49:08.723976 master-0 kubenswrapper[15202]: W0319 09:49:08.723906 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bab9d65_06f1_4b08_aa8c_5f12e7d06183.slice/crio-e8ded007440dfb4e7c5b4e9e21198f437e1483568c19559c15400cc2b5962cfc WatchSource:0}: Error finding container e8ded007440dfb4e7c5b4e9e21198f437e1483568c19559c15400cc2b5962cfc: Status 404 returned error can't find the container with id e8ded007440dfb4e7c5b4e9e21198f437e1483568c19559c15400cc2b5962cfc Mar 19 09:49:09.024127 master-0 kubenswrapper[15202]: I0319 09:49:09.023994 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efa4a51f-71e2-4b74-be2f-ade92b38c81c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fefdb13b-b69a-4161-9166-699301ff9a91\") pod \"ovsdbserver-nb-0\" (UID: \"756b7d24-4b2a-48d2-b574-c0a2f3f9a411\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:09.179720 master-0 kubenswrapper[15202]: I0319 09:49:09.179664 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:09.238859 master-0 kubenswrapper[15202]: I0319 09:49:09.238814 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9bab9d65-06f1-4b08-aa8c-5f12e7d06183","Type":"ContainerStarted","Data":"e8ded007440dfb4e7c5b4e9e21198f437e1483568c19559c15400cc2b5962cfc"} Mar 19 09:49:10.481673 master-0 kubenswrapper[15202]: I0319 09:49:10.481612 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 09:49:10.483278 master-0 kubenswrapper[15202]: I0319 09:49:10.483228 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.487585 master-0 kubenswrapper[15202]: I0319 09:49:10.486830 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 09:49:10.488599 master-0 kubenswrapper[15202]: I0319 09:49:10.487883 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 09:49:10.493655 master-0 kubenswrapper[15202]: I0319 09:49:10.491332 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 09:49:10.501824 master-0 kubenswrapper[15202]: I0319 09:49:10.501743 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592061 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592144 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5739f34d-56e3-4305-8f93-bf6d6636f5e6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592183 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbxjw\" (UniqueName: \"kubernetes.io/projected/5739f34d-56e3-4305-8f93-bf6d6636f5e6-kube-api-access-gbxjw\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 
09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592211 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5739f34d-56e3-4305-8f93-bf6d6636f5e6-config\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592247 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a5fe18d-7bfe-4749-8084-375f18d4d707\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d5f1a758-6ea3-433b-b1a7-31b1016678d7\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592290 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5739f34d-56e3-4305-8f93-bf6d6636f5e6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.592510 master-0 kubenswrapper[15202]: I0319 09:49:10.592368 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.593140 master-0 kubenswrapper[15202]: I0319 09:49:10.592583 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 master-0 kubenswrapper[15202]: I0319 09:49:10.693982 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 master-0 kubenswrapper[15202]: I0319 09:49:10.694133 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 master-0 kubenswrapper[15202]: I0319 09:49:10.694190 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 master-0 kubenswrapper[15202]: I0319 09:49:10.694213 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5739f34d-56e3-4305-8f93-bf6d6636f5e6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 master-0 kubenswrapper[15202]: I0319 09:49:10.694398 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbxjw\" (UniqueName: \"kubernetes.io/projected/5739f34d-56e3-4305-8f93-bf6d6636f5e6-kube-api-access-gbxjw\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 
master-0 kubenswrapper[15202]: I0319 09:49:10.694478 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5739f34d-56e3-4305-8f93-bf6d6636f5e6-config\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.694553 master-0 kubenswrapper[15202]: I0319 09:49:10.694536 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a5fe18d-7bfe-4749-8084-375f18d4d707\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d5f1a758-6ea3-433b-b1a7-31b1016678d7\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.695013 master-0 kubenswrapper[15202]: I0319 09:49:10.694612 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5739f34d-56e3-4305-8f93-bf6d6636f5e6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.695013 master-0 kubenswrapper[15202]: I0319 09:49:10.694917 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5739f34d-56e3-4305-8f93-bf6d6636f5e6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.696345 master-0 kubenswrapper[15202]: I0319 09:49:10.696309 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:49:10.696430 master-0 kubenswrapper[15202]: I0319 09:49:10.696348 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a5fe18d-7bfe-4749-8084-375f18d4d707\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d5f1a758-6ea3-433b-b1a7-31b1016678d7\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/da7a2b774da3e538d8ec281067cc18832d574ddfc7072d3db27250b064d1cb5d/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.699129 master-0 kubenswrapper[15202]: I0319 09:49:10.699085 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5739f34d-56e3-4305-8f93-bf6d6636f5e6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.699775 master-0 kubenswrapper[15202]: I0319 09:49:10.699739 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.700546 master-0 kubenswrapper[15202]: I0319 09:49:10.700501 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.707330 master-0 kubenswrapper[15202]: I0319 09:49:10.701854 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5739f34d-56e3-4305-8f93-bf6d6636f5e6-config\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.716014 master-0 kubenswrapper[15202]: I0319 09:49:10.711405 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbxjw\" (UniqueName: \"kubernetes.io/projected/5739f34d-56e3-4305-8f93-bf6d6636f5e6-kube-api-access-gbxjw\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:10.719099 master-0 kubenswrapper[15202]: I0319 09:49:10.719048 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5739f34d-56e3-4305-8f93-bf6d6636f5e6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:12.267139 master-0 kubenswrapper[15202]: I0319 09:49:12.267093 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a5fe18d-7bfe-4749-8084-375f18d4d707\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d5f1a758-6ea3-433b-b1a7-31b1016678d7\") pod \"ovsdbserver-sb-0\" (UID: \"5739f34d-56e3-4305-8f93-bf6d6636f5e6\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:12.341658 master-0 kubenswrapper[15202]: I0319 09:49:12.341174 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:19.711756 master-0 kubenswrapper[15202]: I0319 09:49:19.711690 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 09:49:20.987924 master-0 kubenswrapper[15202]: W0319 09:49:20.987866 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda491330a_0016_4f3a_b003_bb80733aaaab.slice/crio-791b65962699fd70f033250969f5aafc6a04d2c8a3df50e5c62b46cb415fc9f3 WatchSource:0}: Error finding container 791b65962699fd70f033250969f5aafc6a04d2c8a3df50e5c62b46cb415fc9f3: Status 404 returned error can't find the container with id 791b65962699fd70f033250969f5aafc6a04d2c8a3df50e5c62b46cb415fc9f3 Mar 19 09:49:21.033348 master-0 kubenswrapper[15202]: I0319 09:49:21.032908 15202 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:49:21.394371 master-0 kubenswrapper[15202]: I0319 09:49:21.394321 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a491330a-0016-4f3a-b003-bb80733aaaab","Type":"ContainerStarted","Data":"791b65962699fd70f033250969f5aafc6a04d2c8a3df50e5c62b46cb415fc9f3"} Mar 19 09:49:21.482823 master-0 kubenswrapper[15202]: I0319 09:49:21.479796 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:49:21.660796 master-0 kubenswrapper[15202]: I0319 09:49:21.660376 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 09:49:21.700629 master-0 kubenswrapper[15202]: W0319 09:49:21.700589 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534e2f72_f4ac_40f2_8dad_a1100e7c67b1.slice/crio-37dbc0ccefad5479844c8120704547906ce44ca6672ded97236bbfe2f0f7cd1f WatchSource:0}: Error finding container 
37dbc0ccefad5479844c8120704547906ce44ca6672ded97236bbfe2f0f7cd1f: Status 404 returned error can't find the container with id 37dbc0ccefad5479844c8120704547906ce44ca6672ded97236bbfe2f0f7cd1f Mar 19 09:49:22.006915 master-0 kubenswrapper[15202]: I0319 09:49:22.006816 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m68fw"] Mar 19 09:49:22.025854 master-0 kubenswrapper[15202]: W0319 09:49:22.021708 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ed2b7a9_27a6_43ac_ba84_ae1a7d670160.slice/crio-6f49fd5cf0c72c20881179cbea2cb254bd41eb5f66dc0f4e29bab4d5f49383c7 WatchSource:0}: Error finding container 6f49fd5cf0c72c20881179cbea2cb254bd41eb5f66dc0f4e29bab4d5f49383c7: Status 404 returned error can't find the container with id 6f49fd5cf0c72c20881179cbea2cb254bd41eb5f66dc0f4e29bab4d5f49383c7 Mar 19 09:49:22.216819 master-0 kubenswrapper[15202]: I0319 09:49:22.216769 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sl66q"] Mar 19 09:49:22.308915 master-0 kubenswrapper[15202]: W0319 09:49:22.308774 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4e5d5b_673c_4292_8b11_b58920594cf5.slice/crio-68effc7dc5fa41bf2f0e8825da3cb74995405c91c5f22aff25fafbadadfd104c WatchSource:0}: Error finding container 68effc7dc5fa41bf2f0e8825da3cb74995405c91c5f22aff25fafbadadfd104c: Status 404 returned error can't find the container with id 68effc7dc5fa41bf2f0e8825da3cb74995405c91c5f22aff25fafbadadfd104c Mar 19 09:49:22.412587 master-0 kubenswrapper[15202]: I0319 09:49:22.412022 15202 generic.go:334] "Generic (PLEG): container finished" podID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerID="496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527" exitCode=0 Mar 19 09:49:22.412587 master-0 kubenswrapper[15202]: I0319 09:49:22.412103 15202 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-8tphm" event={"ID":"784cfe54-5ee5-4c81-a106-d785d5803e58","Type":"ContainerDied","Data":"496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527"} Mar 19 09:49:22.419225 master-0 kubenswrapper[15202]: I0319 09:49:22.419144 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw" event={"ID":"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160","Type":"ContainerStarted","Data":"6f49fd5cf0c72c20881179cbea2cb254bd41eb5f66dc0f4e29bab4d5f49383c7"} Mar 19 09:49:22.422928 master-0 kubenswrapper[15202]: I0319 09:49:22.422886 15202 generic.go:334] "Generic (PLEG): container finished" podID="dd3e6a54-c98e-4598-972d-2d1ab10797db" containerID="b60803831ff41b4995ed2ab8b7ac03cf4da34d7c44aabdbed4c9c40bac0bd4a7" exitCode=0 Mar 19 09:49:22.423022 master-0 kubenswrapper[15202]: I0319 09:49:22.422993 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q" event={"ID":"dd3e6a54-c98e-4598-972d-2d1ab10797db","Type":"ContainerDied","Data":"b60803831ff41b4995ed2ab8b7ac03cf4da34d7c44aabdbed4c9c40bac0bd4a7"} Mar 19 09:49:22.427521 master-0 kubenswrapper[15202]: I0319 09:49:22.426981 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67fbe9e8-1121-4091-954c-c6a620d98528","Type":"ContainerStarted","Data":"4bb760ab8dfb2a3088f0ada7d24150963ccfbfb0e7e49269bedcb84b261d8fd9"} Mar 19 09:49:22.433914 master-0 kubenswrapper[15202]: I0319 09:49:22.433735 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d9d44135-dd46-4cc3-aa4f-21c5b9d1604c","Type":"ContainerStarted","Data":"a70642051a01296ffbbd3fe7fd94e3bab9fdda73aa15eb4ae881395699e5b065"} Mar 19 09:49:22.434125 master-0 kubenswrapper[15202]: I0319 09:49:22.434094 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 09:49:22.440869 master-0 
kubenswrapper[15202]: I0319 09:49:22.440777 15202 generic.go:334] "Generic (PLEG): container finished" podID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerID="483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb" exitCode=0 Mar 19 09:49:22.441154 master-0 kubenswrapper[15202]: I0319 09:49:22.441131 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" event={"ID":"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7","Type":"ContainerDied","Data":"483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb"} Mar 19 09:49:22.443909 master-0 kubenswrapper[15202]: I0319 09:49:22.443374 15202 generic.go:334] "Generic (PLEG): container finished" podID="ddd576b4-a565-472e-a6a4-7c14e86f9458" containerID="0cf6f1fa50f5474689b7ad78149738cd5b0c17e97bdc51e18369debe139fe52a" exitCode=0 Mar 19 09:49:22.443909 master-0 kubenswrapper[15202]: I0319 09:49:22.443487 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk" event={"ID":"ddd576b4-a565-472e-a6a4-7c14e86f9458","Type":"ContainerDied","Data":"0cf6f1fa50f5474689b7ad78149738cd5b0c17e97bdc51e18369debe139fe52a"} Mar 19 09:49:22.446357 master-0 kubenswrapper[15202]: I0319 09:49:22.446259 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"534e2f72-f4ac-40f2-8dad-a1100e7c67b1","Type":"ContainerStarted","Data":"37dbc0ccefad5479844c8120704547906ce44ca6672ded97236bbfe2f0f7cd1f"} Mar 19 09:49:22.450576 master-0 kubenswrapper[15202]: I0319 09:49:22.450515 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sl66q" event={"ID":"5d4e5d5b-673c-4292-8b11-b58920594cf5","Type":"ContainerStarted","Data":"68effc7dc5fa41bf2f0e8825da3cb74995405c91c5f22aff25fafbadadfd104c"} Mar 19 09:49:22.511081 master-0 kubenswrapper[15202]: I0319 09:49:22.511015 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" 
podStartSLOduration=2.7568023 podStartE2EDuration="21.510996906s" podCreationTimestamp="2026-03-19 09:49:01 +0000 UTC" firstStartedPulling="2026-03-19 09:49:02.36728573 +0000 UTC m=+1459.752700546" lastFinishedPulling="2026-03-19 09:49:21.121480326 +0000 UTC m=+1478.506895152" observedRunningTime="2026-03-19 09:49:22.506884695 +0000 UTC m=+1479.892299501" watchObservedRunningTime="2026-03-19 09:49:22.510996906 +0000 UTC m=+1479.896411722" Mar 19 09:49:23.066523 master-0 kubenswrapper[15202]: I0319 09:49:23.066428 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 09:49:23.364983 master-0 kubenswrapper[15202]: I0319 09:49:23.364938 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk" Mar 19 09:49:23.396318 master-0 kubenswrapper[15202]: I0319 09:49:23.395766 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q" Mar 19 09:49:23.404530 master-0 kubenswrapper[15202]: I0319 09:49:23.404454 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjxbh\" (UniqueName: \"kubernetes.io/projected/dd3e6a54-c98e-4598-972d-2d1ab10797db-kube-api-access-zjxbh\") pod \"dd3e6a54-c98e-4598-972d-2d1ab10797db\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " Mar 19 09:49:23.404773 master-0 kubenswrapper[15202]: I0319 09:49:23.404751 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd576b4-a565-472e-a6a4-7c14e86f9458-config\") pod \"ddd576b4-a565-472e-a6a4-7c14e86f9458\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " Mar 19 09:49:23.404858 master-0 kubenswrapper[15202]: I0319 09:49:23.404836 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-dns-svc\") pod \"dd3e6a54-c98e-4598-972d-2d1ab10797db\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " Mar 19 09:49:23.404952 master-0 kubenswrapper[15202]: I0319 09:49:23.404913 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-config\") pod \"dd3e6a54-c98e-4598-972d-2d1ab10797db\" (UID: \"dd3e6a54-c98e-4598-972d-2d1ab10797db\") " Mar 19 09:49:23.405002 master-0 kubenswrapper[15202]: I0319 09:49:23.404973 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpw7x\" (UniqueName: \"kubernetes.io/projected/ddd576b4-a565-472e-a6a4-7c14e86f9458-kube-api-access-kpw7x\") pod \"ddd576b4-a565-472e-a6a4-7c14e86f9458\" (UID: \"ddd576b4-a565-472e-a6a4-7c14e86f9458\") " Mar 19 09:49:23.417389 master-0 kubenswrapper[15202]: I0319 09:49:23.417333 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3e6a54-c98e-4598-972d-2d1ab10797db-kube-api-access-zjxbh" (OuterVolumeSpecName: "kube-api-access-zjxbh") pod "dd3e6a54-c98e-4598-972d-2d1ab10797db" (UID: "dd3e6a54-c98e-4598-972d-2d1ab10797db"). InnerVolumeSpecName "kube-api-access-zjxbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:23.464269 master-0 kubenswrapper[15202]: I0319 09:49:23.463354 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd576b4-a565-472e-a6a4-7c14e86f9458-kube-api-access-kpw7x" (OuterVolumeSpecName: "kube-api-access-kpw7x") pod "ddd576b4-a565-472e-a6a4-7c14e86f9458" (UID: "ddd576b4-a565-472e-a6a4-7c14e86f9458"). InnerVolumeSpecName "kube-api-access-kpw7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:23.478390 master-0 kubenswrapper[15202]: I0319 09:49:23.478333 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk" event={"ID":"ddd576b4-a565-472e-a6a4-7c14e86f9458","Type":"ContainerDied","Data":"d0779330e323b142507d22f2b4cd2519b8531c9c5272086f2cc2819a73cbb1af"} Mar 19 09:49:23.478390 master-0 kubenswrapper[15202]: I0319 09:49:23.478394 15202 scope.go:117] "RemoveContainer" containerID="0cf6f1fa50f5474689b7ad78149738cd5b0c17e97bdc51e18369debe139fe52a" Mar 19 09:49:23.478633 master-0 kubenswrapper[15202]: I0319 09:49:23.478602 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cdfrk" Mar 19 09:49:23.486035 master-0 kubenswrapper[15202]: I0319 09:49:23.485982 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd3e6a54-c98e-4598-972d-2d1ab10797db" (UID: "dd3e6a54-c98e-4598-972d-2d1ab10797db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:23.488155 master-0 kubenswrapper[15202]: I0319 09:49:23.488073 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9bab9d65-06f1-4b08-aa8c-5f12e7d06183","Type":"ContainerStarted","Data":"ebec7d98a3d83cb31d22f9e455d5d4152c2b8a16ffdd6185b239bdfe6662a137"} Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.497739 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd576b4-a565-472e-a6a4-7c14e86f9458-config" (OuterVolumeSpecName: "config") pod "ddd576b4-a565-472e-a6a4-7c14e86f9458" (UID: "ddd576b4-a565-472e-a6a4-7c14e86f9458"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.500505 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"756b7d24-4b2a-48d2-b574-c0a2f3f9a411","Type":"ContainerStarted","Data":"5bbc1e72e3a6e902995e35ca79c778203a05788e57b0dbc9cc21676deb2fd073"} Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.503684 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q" event={"ID":"dd3e6a54-c98e-4598-972d-2d1ab10797db","Type":"ContainerDied","Data":"bc909673f398054c9b8d92bce53a62f4b4ce1e712c5a891f2d7b870c000b836d"} Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.503767 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-6bm4q" Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.506610 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjxbh\" (UniqueName: \"kubernetes.io/projected/dd3e6a54-c98e-4598-972d-2d1ab10797db-kube-api-access-zjxbh\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.506629 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd576b4-a565-472e-a6a4-7c14e86f9458-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.506639 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.506648 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpw7x\" (UniqueName: 
\"kubernetes.io/projected/ddd576b4-a565-472e-a6a4-7c14e86f9458-kube-api-access-kpw7x\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:23.512056 master-0 kubenswrapper[15202]: I0319 09:49:23.507104 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67fbe9e8-1121-4091-954c-c6a620d98528","Type":"ContainerStarted","Data":"e1a908256ed8d079197cd9a0453f42cc838e68649df1a123e8b31c033bc7d0a4"} Mar 19 09:49:23.513959 master-0 kubenswrapper[15202]: I0319 09:49:23.513911 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-config" (OuterVolumeSpecName: "config") pod "dd3e6a54-c98e-4598-972d-2d1ab10797db" (UID: "dd3e6a54-c98e-4598-972d-2d1ab10797db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:23.526534 master-0 kubenswrapper[15202]: I0319 09:49:23.526462 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" event={"ID":"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7","Type":"ContainerStarted","Data":"23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001"} Mar 19 09:49:23.526901 master-0 kubenswrapper[15202]: I0319 09:49:23.526875 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" Mar 19 09:49:23.554841 master-0 kubenswrapper[15202]: I0319 09:49:23.548811 15202 scope.go:117] "RemoveContainer" containerID="b60803831ff41b4995ed2ab8b7ac03cf4da34d7c44aabdbed4c9c40bac0bd4a7" Mar 19 09:49:23.613373 master-0 kubenswrapper[15202]: I0319 09:49:23.613311 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3e6a54-c98e-4598-972d-2d1ab10797db-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:23.618816 master-0 kubenswrapper[15202]: I0319 09:49:23.618728 15202 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" podStartSLOduration=4.457602578 podStartE2EDuration="27.61870867s" podCreationTimestamp="2026-03-19 09:48:56 +0000 UTC" firstStartedPulling="2026-03-19 09:48:57.959497592 +0000 UTC m=+1455.344912408" lastFinishedPulling="2026-03-19 09:49:21.120603684 +0000 UTC m=+1478.506018500" observedRunningTime="2026-03-19 09:49:23.617550561 +0000 UTC m=+1481.002965397" watchObservedRunningTime="2026-03-19 09:49:23.61870867 +0000 UTC m=+1481.004123486" Mar 19 09:49:23.879002 master-0 kubenswrapper[15202]: I0319 09:49:23.878602 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 09:49:23.880489 master-0 kubenswrapper[15202]: I0319 09:49:23.880442 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cdfrk"] Mar 19 09:49:23.905161 master-0 kubenswrapper[15202]: I0319 09:49:23.905087 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cdfrk"] Mar 19 09:49:23.946362 master-0 kubenswrapper[15202]: I0319 09:49:23.946303 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-6bm4q"] Mar 19 09:49:23.949257 master-0 kubenswrapper[15202]: I0319 09:49:23.949233 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-6bm4q"] Mar 19 09:49:24.550818 master-0 kubenswrapper[15202]: I0319 09:49:24.550549 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-8tphm" event={"ID":"784cfe54-5ee5-4c81-a106-d785d5803e58","Type":"ContainerStarted","Data":"627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd"} Mar 19 09:49:24.551995 master-0 kubenswrapper[15202]: I0319 09:49:24.550867 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76849d6659-8tphm" Mar 19 09:49:24.555126 master-0 kubenswrapper[15202]: I0319 09:49:24.555038 15202 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5739f34d-56e3-4305-8f93-bf6d6636f5e6","Type":"ContainerStarted","Data":"aca1a0968c7bc6cbb9b4712cb488c3a7797cc33fe4d8a7eb4d02c0ce856b5758"} Mar 19 09:49:24.585313 master-0 kubenswrapper[15202]: I0319 09:49:24.584896 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76849d6659-8tphm" podStartSLOduration=4.499921641 podStartE2EDuration="28.584870505s" podCreationTimestamp="2026-03-19 09:48:56 +0000 UTC" firstStartedPulling="2026-03-19 09:48:57.214778585 +0000 UTC m=+1454.600193401" lastFinishedPulling="2026-03-19 09:49:21.299727449 +0000 UTC m=+1478.685142265" observedRunningTime="2026-03-19 09:49:24.577666698 +0000 UTC m=+1481.963081514" watchObservedRunningTime="2026-03-19 09:49:24.584870505 +0000 UTC m=+1481.970285311" Mar 19 09:49:24.826307 master-0 kubenswrapper[15202]: I0319 09:49:24.826129 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3e6a54-c98e-4598-972d-2d1ab10797db" path="/var/lib/kubelet/pods/dd3e6a54-c98e-4598-972d-2d1ab10797db/volumes" Mar 19 09:49:24.826733 master-0 kubenswrapper[15202]: I0319 09:49:24.826708 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd576b4-a565-472e-a6a4-7c14e86f9458" path="/var/lib/kubelet/pods/ddd576b4-a565-472e-a6a4-7c14e86f9458/volumes" Mar 19 09:49:26.603679 master-0 kubenswrapper[15202]: I0319 09:49:26.603592 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 09:49:30.639407 master-0 kubenswrapper[15202]: I0319 09:49:30.639317 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a491330a-0016-4f3a-b003-bb80733aaaab","Type":"ContainerStarted","Data":"c9c8c0477af17f362345a1a03feda2123d40e594da189f4f9dc9ec5d13c3d1d5"} Mar 19 09:49:30.640827 master-0 kubenswrapper[15202]: I0319 09:49:30.640788 15202 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5739f34d-56e3-4305-8f93-bf6d6636f5e6","Type":"ContainerStarted","Data":"97c9e313b9bc4fd9f01166e1a676f6be9082472a734e5e454339cdf1ab8101da"} Mar 19 09:49:30.642921 master-0 kubenswrapper[15202]: I0319 09:49:30.642882 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"756b7d24-4b2a-48d2-b574-c0a2f3f9a411","Type":"ContainerStarted","Data":"a4edcb8c245f0744e0a0b5efb684f96ab69ce53d9cfb0cc67d3b4e3624268fb8"} Mar 19 09:49:30.645876 master-0 kubenswrapper[15202]: I0319 09:49:30.645826 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"534e2f72-f4ac-40f2-8dad-a1100e7c67b1","Type":"ContainerStarted","Data":"3529f469b9b9de1b3d88dc4bd66aa6952a06eccc082e7712925611b45c18629c"} Mar 19 09:49:30.648609 master-0 kubenswrapper[15202]: I0319 09:49:30.648548 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw" event={"ID":"2ed2b7a9-27a6-43ac-ba84-ae1a7d670160","Type":"ContainerStarted","Data":"3ad1bb8c0dbfc4d9d5235bc48534c15d687fcc44bb42cd7555b1207bbeee99b8"} Mar 19 09:49:30.648863 master-0 kubenswrapper[15202]: I0319 09:49:30.648834 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m68fw" Mar 19 09:49:30.651289 master-0 kubenswrapper[15202]: I0319 09:49:30.651238 15202 generic.go:334] "Generic (PLEG): container finished" podID="5d4e5d5b-673c-4292-8b11-b58920594cf5" containerID="85b40ba8fcd2c446ab6215c4b78a50763c88f3a8a5cebbce58f1c43091789c90" exitCode=0 Mar 19 09:49:30.651386 master-0 kubenswrapper[15202]: I0319 09:49:30.651298 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sl66q" event={"ID":"5d4e5d5b-673c-4292-8b11-b58920594cf5","Type":"ContainerDied","Data":"85b40ba8fcd2c446ab6215c4b78a50763c88f3a8a5cebbce58f1c43091789c90"} Mar 19 09:49:30.772942 master-0 kubenswrapper[15202]: I0319 
09:49:30.772877 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m68fw" podStartSLOduration=16.139839228 podStartE2EDuration="23.772830255s" podCreationTimestamp="2026-03-19 09:49:07 +0000 UTC" firstStartedPulling="2026-03-19 09:49:22.034703186 +0000 UTC m=+1479.420118002" lastFinishedPulling="2026-03-19 09:49:29.667694223 +0000 UTC m=+1487.053109029" observedRunningTime="2026-03-19 09:49:30.716707281 +0000 UTC m=+1488.102122097" watchObservedRunningTime="2026-03-19 09:49:30.772830255 +0000 UTC m=+1488.158245071" Mar 19 09:49:31.643722 master-0 kubenswrapper[15202]: I0319 09:49:31.643659 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76849d6659-8tphm" Mar 19 09:49:31.691199 master-0 kubenswrapper[15202]: I0319 09:49:31.691086 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sl66q" event={"ID":"5d4e5d5b-673c-4292-8b11-b58920594cf5","Type":"ContainerStarted","Data":"0605fee7dc517465ef3e7c76f314344016004c50257a1fcb5eeb903e7f22db72"} Mar 19 09:49:31.691199 master-0 kubenswrapper[15202]: I0319 09:49:31.691199 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sl66q" event={"ID":"5d4e5d5b-673c-4292-8b11-b58920594cf5","Type":"ContainerStarted","Data":"e966af152e4cce0a070003f0d88db4b5201455c270937a88452b3202c0d87121"} Mar 19 09:49:31.693717 master-0 kubenswrapper[15202]: I0319 09:49:31.693691 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:31.693821 master-0 kubenswrapper[15202]: I0319 09:49:31.693727 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:49:31.781546 master-0 kubenswrapper[15202]: I0319 09:49:31.769325 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-sl66q" 
podStartSLOduration=17.413342438 podStartE2EDuration="24.769290216s" podCreationTimestamp="2026-03-19 09:49:07 +0000 UTC" firstStartedPulling="2026-03-19 09:49:22.311740395 +0000 UTC m=+1479.697155211" lastFinishedPulling="2026-03-19 09:49:29.667688173 +0000 UTC m=+1487.053102989" observedRunningTime="2026-03-19 09:49:31.741542633 +0000 UTC m=+1489.126957459" watchObservedRunningTime="2026-03-19 09:49:31.769290216 +0000 UTC m=+1489.154705032" Mar 19 09:49:32.273730 master-0 kubenswrapper[15202]: I0319 09:49:32.273668 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" Mar 19 09:49:32.385504 master-0 kubenswrapper[15202]: I0319 09:49:32.377992 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-8tphm"] Mar 19 09:49:32.385504 master-0 kubenswrapper[15202]: I0319 09:49:32.378288 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76849d6659-8tphm" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerName="dnsmasq-dns" containerID="cri-o://627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd" gracePeriod=10 Mar 19 09:49:36.614842 master-0 kubenswrapper[15202]: I0319 09:49:36.614790 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-8tphm" Mar 19 09:49:36.706278 master-0 kubenswrapper[15202]: I0319 09:49:36.706193 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rb82\" (UniqueName: \"kubernetes.io/projected/784cfe54-5ee5-4c81-a106-d785d5803e58-kube-api-access-9rb82\") pod \"784cfe54-5ee5-4c81-a106-d785d5803e58\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " Mar 19 09:49:36.706548 master-0 kubenswrapper[15202]: I0319 09:49:36.706291 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-config\") pod \"784cfe54-5ee5-4c81-a106-d785d5803e58\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " Mar 19 09:49:36.706741 master-0 kubenswrapper[15202]: I0319 09:49:36.706712 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-dns-svc\") pod \"784cfe54-5ee5-4c81-a106-d785d5803e58\" (UID: \"784cfe54-5ee5-4c81-a106-d785d5803e58\") " Mar 19 09:49:36.711076 master-0 kubenswrapper[15202]: I0319 09:49:36.711012 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784cfe54-5ee5-4c81-a106-d785d5803e58-kube-api-access-9rb82" (OuterVolumeSpecName: "kube-api-access-9rb82") pod "784cfe54-5ee5-4c81-a106-d785d5803e58" (UID: "784cfe54-5ee5-4c81-a106-d785d5803e58"). InnerVolumeSpecName "kube-api-access-9rb82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:36.752633 master-0 kubenswrapper[15202]: I0319 09:49:36.752491 15202 generic.go:334] "Generic (PLEG): container finished" podID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerID="627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd" exitCode=0 Mar 19 09:49:36.752633 master-0 kubenswrapper[15202]: I0319 09:49:36.752559 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-8tphm" Mar 19 09:49:36.753299 master-0 kubenswrapper[15202]: I0319 09:49:36.752566 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-8tphm" event={"ID":"784cfe54-5ee5-4c81-a106-d785d5803e58","Type":"ContainerDied","Data":"627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd"} Mar 19 09:49:36.753299 master-0 kubenswrapper[15202]: I0319 09:49:36.752956 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-8tphm" event={"ID":"784cfe54-5ee5-4c81-a106-d785d5803e58","Type":"ContainerDied","Data":"4a8d780311e93fc4f04922e55d522df7cb06ca95982e7ab7f4f576b777d87741"} Mar 19 09:49:36.753299 master-0 kubenswrapper[15202]: I0319 09:49:36.752977 15202 scope.go:117] "RemoveContainer" containerID="627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd" Mar 19 09:49:36.757704 master-0 kubenswrapper[15202]: I0319 09:49:36.757634 15202 generic.go:334] "Generic (PLEG): container finished" podID="534e2f72-f4ac-40f2-8dad-a1100e7c67b1" containerID="3529f469b9b9de1b3d88dc4bd66aa6952a06eccc082e7712925611b45c18629c" exitCode=0 Mar 19 09:49:36.757776 master-0 kubenswrapper[15202]: I0319 09:49:36.757735 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"534e2f72-f4ac-40f2-8dad-a1100e7c67b1","Type":"ContainerDied","Data":"3529f469b9b9de1b3d88dc4bd66aa6952a06eccc082e7712925611b45c18629c"} Mar 19 09:49:36.760062 master-0 
kubenswrapper[15202]: I0319 09:49:36.760025 15202 generic.go:334] "Generic (PLEG): container finished" podID="a491330a-0016-4f3a-b003-bb80733aaaab" containerID="c9c8c0477af17f362345a1a03feda2123d40e594da189f4f9dc9ec5d13c3d1d5" exitCode=0 Mar 19 09:49:36.760141 master-0 kubenswrapper[15202]: I0319 09:49:36.760056 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a491330a-0016-4f3a-b003-bb80733aaaab","Type":"ContainerDied","Data":"c9c8c0477af17f362345a1a03feda2123d40e594da189f4f9dc9ec5d13c3d1d5"} Mar 19 09:49:36.761340 master-0 kubenswrapper[15202]: I0319 09:49:36.761292 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-config" (OuterVolumeSpecName: "config") pod "784cfe54-5ee5-4c81-a106-d785d5803e58" (UID: "784cfe54-5ee5-4c81-a106-d785d5803e58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:36.767878 master-0 kubenswrapper[15202]: I0319 09:49:36.763523 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "784cfe54-5ee5-4c81-a106-d785d5803e58" (UID: "784cfe54-5ee5-4c81-a106-d785d5803e58"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:36.805586 master-0 kubenswrapper[15202]: I0319 09:49:36.805532 15202 scope.go:117] "RemoveContainer" containerID="496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527" Mar 19 09:49:36.809096 master-0 kubenswrapper[15202]: I0319 09:49:36.809052 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:36.809177 master-0 kubenswrapper[15202]: I0319 09:49:36.809093 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rb82\" (UniqueName: \"kubernetes.io/projected/784cfe54-5ee5-4c81-a106-d785d5803e58-kube-api-access-9rb82\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:36.809177 master-0 kubenswrapper[15202]: I0319 09:49:36.809113 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784cfe54-5ee5-4c81-a106-d785d5803e58-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:36.829892 master-0 kubenswrapper[15202]: I0319 09:49:36.829844 15202 scope.go:117] "RemoveContainer" containerID="627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd" Mar 19 09:49:36.830266 master-0 kubenswrapper[15202]: E0319 09:49:36.830231 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd\": container with ID starting with 627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd not found: ID does not exist" containerID="627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd" Mar 19 09:49:36.830341 master-0 kubenswrapper[15202]: I0319 09:49:36.830264 15202 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd"} err="failed to get container status \"627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd\": rpc error: code = NotFound desc = could not find container \"627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd\": container with ID starting with 627b99f8addcd8108ea36f58f7b3b75b4471d33db388f85020c918d7fd213efd not found: ID does not exist" Mar 19 09:49:36.830341 master-0 kubenswrapper[15202]: I0319 09:49:36.830301 15202 scope.go:117] "RemoveContainer" containerID="496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527" Mar 19 09:49:36.830583 master-0 kubenswrapper[15202]: E0319 09:49:36.830512 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527\": container with ID starting with 496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527 not found: ID does not exist" containerID="496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527" Mar 19 09:49:36.830583 master-0 kubenswrapper[15202]: I0319 09:49:36.830535 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527"} err="failed to get container status \"496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527\": rpc error: code = NotFound desc = could not find container \"496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527\": container with ID starting with 496c1cbadef2aeedde398c0fda844e14568d8fd54f7ea35ee5b0e6ded887a527 not found: ID does not exist" Mar 19 09:49:37.033983 master-0 kubenswrapper[15202]: I0319 09:49:37.033941 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-2qz2r"] Mar 19 09:49:37.034702 master-0 kubenswrapper[15202]: E0319 09:49:37.034684 15202 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3e6a54-c98e-4598-972d-2d1ab10797db" containerName="init" Mar 19 09:49:37.034781 master-0 kubenswrapper[15202]: I0319 09:49:37.034771 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3e6a54-c98e-4598-972d-2d1ab10797db" containerName="init" Mar 19 09:49:37.034852 master-0 kubenswrapper[15202]: E0319 09:49:37.034842 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerName="dnsmasq-dns" Mar 19 09:49:37.034912 master-0 kubenswrapper[15202]: I0319 09:49:37.034902 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerName="dnsmasq-dns" Mar 19 09:49:37.034982 master-0 kubenswrapper[15202]: E0319 09:49:37.034972 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd576b4-a565-472e-a6a4-7c14e86f9458" containerName="init" Mar 19 09:49:37.035041 master-0 kubenswrapper[15202]: I0319 09:49:37.035031 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd576b4-a565-472e-a6a4-7c14e86f9458" containerName="init" Mar 19 09:49:37.035130 master-0 kubenswrapper[15202]: E0319 09:49:37.035118 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerName="init" Mar 19 09:49:37.035195 master-0 kubenswrapper[15202]: I0319 09:49:37.035185 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerName="init" Mar 19 09:49:37.035434 master-0 kubenswrapper[15202]: I0319 09:49:37.035420 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3e6a54-c98e-4598-972d-2d1ab10797db" containerName="init" Mar 19 09:49:37.036043 master-0 kubenswrapper[15202]: I0319 09:49:37.036027 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd576b4-a565-472e-a6a4-7c14e86f9458" containerName="init" Mar 19 09:49:37.036130 master-0 
kubenswrapper[15202]: I0319 09:49:37.036119 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" containerName="dnsmasq-dns" Mar 19 09:49:37.037332 master-0 kubenswrapper[15202]: I0319 09:49:37.037316 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.133032 master-0 kubenswrapper[15202]: I0319 09:49:37.132905 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-dns-svc\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.133283 master-0 kubenswrapper[15202]: I0319 09:49:37.133208 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-config\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.133534 master-0 kubenswrapper[15202]: I0319 09:49:37.133452 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gr8b\" (UniqueName: \"kubernetes.io/projected/d7a14756-b516-4318-ba07-94afdd584022-kube-api-access-6gr8b\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.155202 master-0 kubenswrapper[15202]: I0319 09:49:37.155141 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-2qz2r"] Mar 19 09:49:37.239123 master-0 kubenswrapper[15202]: I0319 09:49:37.237892 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-config\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.239123 master-0 kubenswrapper[15202]: I0319 09:49:37.236735 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-config\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.239123 master-0 kubenswrapper[15202]: I0319 09:49:37.238092 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gr8b\" (UniqueName: \"kubernetes.io/projected/d7a14756-b516-4318-ba07-94afdd584022-kube-api-access-6gr8b\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.239123 master-0 kubenswrapper[15202]: I0319 09:49:37.238796 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-dns-svc\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.240392 master-0 kubenswrapper[15202]: I0319 09:49:37.239575 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-dns-svc\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.251952 master-0 kubenswrapper[15202]: I0319 09:49:37.250748 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-8tphm"] Mar 19 09:49:37.263256 master-0 
kubenswrapper[15202]: I0319 09:49:37.263181 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gr8b\" (UniqueName: \"kubernetes.io/projected/d7a14756-b516-4318-ba07-94afdd584022-kube-api-access-6gr8b\") pod \"dnsmasq-dns-7bb8ffc699-2qz2r\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.265665 master-0 kubenswrapper[15202]: I0319 09:49:37.265608 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-8tphm"] Mar 19 09:49:37.520868 master-0 kubenswrapper[15202]: I0319 09:49:37.520813 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:37.791668 master-0 kubenswrapper[15202]: I0319 09:49:37.791568 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5739f34d-56e3-4305-8f93-bf6d6636f5e6","Type":"ContainerStarted","Data":"59123f7197f629744fef3ef335255202c119821d02484fa6fb70a1a91fafb6a0"} Mar 19 09:49:37.797653 master-0 kubenswrapper[15202]: I0319 09:49:37.796944 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"756b7d24-4b2a-48d2-b574-c0a2f3f9a411","Type":"ContainerStarted","Data":"840cbf3f18e1bb8a2c33526ce5e508e0de20fa3f652d43915b42352f12c7c7f7"} Mar 19 09:49:37.803867 master-0 kubenswrapper[15202]: I0319 09:49:37.803546 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"534e2f72-f4ac-40f2-8dad-a1100e7c67b1","Type":"ContainerStarted","Data":"e280f0427cd6bb2e7ff3331b9049d43fec0a6e851baebed276c344bdd5f32a36"} Mar 19 09:49:37.815798 master-0 kubenswrapper[15202]: I0319 09:49:37.815049 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"a491330a-0016-4f3a-b003-bb80733aaaab","Type":"ContainerStarted","Data":"b7c3c7bcdee59d2ffe8a6b6ed098c3f2512ddc92e1de51c83dc5ce2a9403209a"} Mar 19 09:49:37.820392 master-0 kubenswrapper[15202]: I0319 09:49:37.819555 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.606409176 podStartE2EDuration="29.819533551s" podCreationTimestamp="2026-03-19 09:49:08 +0000 UTC" firstStartedPulling="2026-03-19 09:49:23.905082659 +0000 UTC m=+1481.290497465" lastFinishedPulling="2026-03-19 09:49:37.118207024 +0000 UTC m=+1494.503621840" observedRunningTime="2026-03-19 09:49:37.815288557 +0000 UTC m=+1495.200703383" watchObservedRunningTime="2026-03-19 09:49:37.819533551 +0000 UTC m=+1495.204948367" Mar 19 09:49:37.846571 master-0 kubenswrapper[15202]: I0319 09:49:37.846263 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.074291121 podStartE2EDuration="32.8462421s" podCreationTimestamp="2026-03-19 09:49:05 +0000 UTC" firstStartedPulling="2026-03-19 09:49:23.227079437 +0000 UTC m=+1480.612494253" lastFinishedPulling="2026-03-19 09:49:36.999030416 +0000 UTC m=+1494.384445232" observedRunningTime="2026-03-19 09:49:37.836241823 +0000 UTC m=+1495.221656639" watchObservedRunningTime="2026-03-19 09:49:37.8462421 +0000 UTC m=+1495.231656916" Mar 19 09:49:37.900095 master-0 kubenswrapper[15202]: I0319 09:49:37.897852 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=31.92557172 podStartE2EDuration="39.897825901s" podCreationTimestamp="2026-03-19 09:48:58 +0000 UTC" firstStartedPulling="2026-03-19 09:49:21.717579768 +0000 UTC m=+1479.102994584" lastFinishedPulling="2026-03-19 09:49:29.689833939 +0000 UTC m=+1487.075248765" observedRunningTime="2026-03-19 09:49:37.858994414 +0000 UTC m=+1495.244409230" watchObservedRunningTime="2026-03-19 
09:49:37.897825901 +0000 UTC m=+1495.283240727" Mar 19 09:49:37.933561 master-0 kubenswrapper[15202]: I0319 09:49:37.931822 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.295747966 podStartE2EDuration="38.931797908s" podCreationTimestamp="2026-03-19 09:48:59 +0000 UTC" firstStartedPulling="2026-03-19 09:49:21.03281919 +0000 UTC m=+1478.418234006" lastFinishedPulling="2026-03-19 09:49:29.668869132 +0000 UTC m=+1487.054283948" observedRunningTime="2026-03-19 09:49:37.924846997 +0000 UTC m=+1495.310270653" watchObservedRunningTime="2026-03-19 09:49:37.931797908 +0000 UTC m=+1495.317212724" Mar 19 09:49:38.040259 master-0 kubenswrapper[15202]: I0319 09:49:38.040012 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-2qz2r"] Mar 19 09:49:38.042904 master-0 kubenswrapper[15202]: W0319 09:49:38.042858 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7a14756_b516_4318_ba07_94afdd584022.slice/crio-9feca4a6d068273d1b5aa7fb7dfaa2d8b78e58def58954aaf291cd44f2953e83 WatchSource:0}: Error finding container 9feca4a6d068273d1b5aa7fb7dfaa2d8b78e58def58954aaf291cd44f2953e83: Status 404 returned error can't find the container with id 9feca4a6d068273d1b5aa7fb7dfaa2d8b78e58def58954aaf291cd44f2953e83 Mar 19 09:49:38.827759 master-0 kubenswrapper[15202]: I0319 09:49:38.827665 15202 generic.go:334] "Generic (PLEG): container finished" podID="d7a14756-b516-4318-ba07-94afdd584022" containerID="617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4" exitCode=0 Mar 19 09:49:38.829787 master-0 kubenswrapper[15202]: I0319 09:49:38.828788 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784cfe54-5ee5-4c81-a106-d785d5803e58" path="/var/lib/kubelet/pods/784cfe54-5ee5-4c81-a106-d785d5803e58/volumes" Mar 19 09:49:38.829787 master-0 
kubenswrapper[15202]: I0319 09:49:38.829424 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" event={"ID":"d7a14756-b516-4318-ba07-94afdd584022","Type":"ContainerDied","Data":"617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4"} Mar 19 09:49:38.829787 master-0 kubenswrapper[15202]: I0319 09:49:38.829453 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" event={"ID":"d7a14756-b516-4318-ba07-94afdd584022","Type":"ContainerStarted","Data":"9feca4a6d068273d1b5aa7fb7dfaa2d8b78e58def58954aaf291cd44f2953e83"} Mar 19 09:49:39.108033 master-0 kubenswrapper[15202]: I0319 09:49:39.107890 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 09:49:39.116258 master-0 kubenswrapper[15202]: I0319 09:49:39.116186 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 19 09:49:39.122055 master-0 kubenswrapper[15202]: I0319 09:49:39.122007 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 09:49:39.122055 master-0 kubenswrapper[15202]: I0319 09:49:39.122033 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 09:49:39.122304 master-0 kubenswrapper[15202]: I0319 09:49:39.122030 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 09:49:39.139486 master-0 kubenswrapper[15202]: I0319 09:49:39.138773 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k2tq\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-kube-api-access-7k2tq\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.139486 master-0 kubenswrapper[15202]: I0319 09:49:39.138870 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.139486 master-0 kubenswrapper[15202]: I0319 09:49:39.138930 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d9c99748-0ca1-4e25-947a-801c2e8748f5-lock\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.139486 master-0 kubenswrapper[15202]: I0319 09:49:39.138950 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c99748-0ca1-4e25-947a-801c2e8748f5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.139486 master-0 kubenswrapper[15202]: I0319 09:49:39.138993 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d9c99748-0ca1-4e25-947a-801c2e8748f5-cache\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.139486 master-0 kubenswrapper[15202]: I0319 09:49:39.139014 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d5fb6d2-3b43-49ba-ba81-25e6cdfebfd2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7f360703-01f1-49e9-9811-8d9dce81c720\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.150607 master-0 kubenswrapper[15202]: I0319 09:49:39.148842 15202 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/swift-storage-0"] Mar 19 09:49:39.181307 master-0 kubenswrapper[15202]: I0319 09:49:39.181235 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:39.181307 master-0 kubenswrapper[15202]: I0319 09:49:39.181305 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:39.225583 master-0 kubenswrapper[15202]: I0319 09:49:39.225531 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:39.241586 master-0 kubenswrapper[15202]: I0319 09:49:39.241315 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d9c99748-0ca1-4e25-947a-801c2e8748f5-cache\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.241586 master-0 kubenswrapper[15202]: I0319 09:49:39.241415 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d5fb6d2-3b43-49ba-ba81-25e6cdfebfd2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7f360703-01f1-49e9-9811-8d9dce81c720\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.241586 master-0 kubenswrapper[15202]: I0319 09:49:39.241518 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k2tq\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-kube-api-access-7k2tq\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.241885 master-0 kubenswrapper[15202]: I0319 09:49:39.241610 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.241885 master-0 kubenswrapper[15202]: I0319 09:49:39.241677 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d9c99748-0ca1-4e25-947a-801c2e8748f5-lock\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.241885 master-0 kubenswrapper[15202]: I0319 09:49:39.241696 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c99748-0ca1-4e25-947a-801c2e8748f5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.241975 master-0 kubenswrapper[15202]: I0319 09:49:39.241882 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d9c99748-0ca1-4e25-947a-801c2e8748f5-cache\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.242014 master-0 kubenswrapper[15202]: E0319 09:49:39.241983 15202 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:49:39.242014 master-0 kubenswrapper[15202]: E0319 09:49:39.241998 15202 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:49:39.242082 master-0 kubenswrapper[15202]: E0319 09:49:39.242040 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift podName:d9c99748-0ca1-4e25-947a-801c2e8748f5 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:49:39.742024505 +0000 UTC m=+1497.127439321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift") pod "swift-storage-0" (UID: "d9c99748-0ca1-4e25-947a-801c2e8748f5") : configmap "swift-ring-files" not found Mar 19 09:49:39.242815 master-0 kubenswrapper[15202]: I0319 09:49:39.242782 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d9c99748-0ca1-4e25-947a-801c2e8748f5-lock\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.243583 master-0 kubenswrapper[15202]: I0319 09:49:39.243541 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:49:39.243660 master-0 kubenswrapper[15202]: I0319 09:49:39.243600 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d5fb6d2-3b43-49ba-ba81-25e6cdfebfd2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7f360703-01f1-49e9-9811-8d9dce81c720\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2dd079a59803aa9a6757835076c35b9f9b36af9c40048a9c6496d50adbc39224/globalmount\"" pod="openstack/swift-storage-0" Mar 19 09:49:39.245474 master-0 kubenswrapper[15202]: I0319 09:49:39.245423 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c99748-0ca1-4e25-947a-801c2e8748f5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.296296 master-0 kubenswrapper[15202]: I0319 09:49:39.293441 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7k2tq\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-kube-api-access-7k2tq\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.343229 master-0 kubenswrapper[15202]: I0319 09:49:39.343155 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:39.398029 master-0 kubenswrapper[15202]: I0319 09:49:39.397877 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:39.728227 master-0 kubenswrapper[15202]: I0319 09:49:39.728161 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-l8hw9"] Mar 19 09:49:39.729876 master-0 kubenswrapper[15202]: I0319 09:49:39.729810 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: I0319 09:49:39.751374 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-l8hw9"] Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: I0319 09:49:39.752726 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: E0319 09:49:39.752919 15202 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: E0319 09:49:39.752938 15202 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: E0319 
09:49:39.752992 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift podName:d9c99748-0ca1-4e25-947a-801c2e8748f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:49:40.752974479 +0000 UTC m=+1498.138389295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift") pod "swift-storage-0" (UID: "d9c99748-0ca1-4e25-947a-801c2e8748f5") : configmap "swift-ring-files" not found Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: I0319 09:49:39.754703 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: I0319 09:49:39.755097 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 09:49:39.759662 master-0 kubenswrapper[15202]: I0319 09:49:39.755286 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 09:49:39.842983 master-0 kubenswrapper[15202]: I0319 09:49:39.842427 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" event={"ID":"d7a14756-b516-4318-ba07-94afdd584022","Type":"ContainerStarted","Data":"8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0"} Mar 19 09:49:39.842983 master-0 kubenswrapper[15202]: I0319 09:49:39.842763 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:39.855032 master-0 kubenswrapper[15202]: I0319 09:49:39.854964 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-combined-ca-bundle\") pod \"swift-ring-rebalance-l8hw9\" (UID: 
\"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.855175 master-0 kubenswrapper[15202]: I0319 09:49:39.855148 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-ring-data-devices\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.855218 master-0 kubenswrapper[15202]: I0319 09:49:39.855191 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wns86\" (UniqueName: \"kubernetes.io/projected/e8372380-9188-4c0f-9e75-05739d26a27c-kube-api-access-wns86\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.855375 master-0 kubenswrapper[15202]: I0319 09:49:39.855299 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-scripts\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.855620 master-0 kubenswrapper[15202]: I0319 09:49:39.855488 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-swiftconf\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.855620 master-0 kubenswrapper[15202]: I0319 09:49:39.855545 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/e8372380-9188-4c0f-9e75-05739d26a27c-etc-swift\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.855620 master-0 kubenswrapper[15202]: I0319 09:49:39.855583 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-dispersionconf\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.877595 master-0 kubenswrapper[15202]: I0319 09:49:39.877405 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" podStartSLOduration=3.8773796750000002 podStartE2EDuration="3.877379675s" podCreationTimestamp="2026-03-19 09:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:39.868874406 +0000 UTC m=+1497.254289222" watchObservedRunningTime="2026-03-19 09:49:39.877379675 +0000 UTC m=+1497.262794491" Mar 19 09:49:39.898906 master-0 kubenswrapper[15202]: I0319 09:49:39.898835 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 09:49:39.945530 master-0 kubenswrapper[15202]: I0319 09:49:39.940211 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 09:49:39.960664 master-0 kubenswrapper[15202]: I0319 09:49:39.958611 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-ring-data-devices\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 
09:49:39.960664 master-0 kubenswrapper[15202]: I0319 09:49:39.958676 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wns86\" (UniqueName: \"kubernetes.io/projected/e8372380-9188-4c0f-9e75-05739d26a27c-kube-api-access-wns86\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.960664 master-0 kubenswrapper[15202]: I0319 09:49:39.958858 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-scripts\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.960664 master-0 kubenswrapper[15202]: I0319 09:49:39.958949 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-swiftconf\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.960664 master-0 kubenswrapper[15202]: I0319 09:49:39.960101 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-ring-data-devices\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.962201 master-0 kubenswrapper[15202]: I0319 09:49:39.962167 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-scripts\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.964811 master-0 
kubenswrapper[15202]: I0319 09:49:39.964753 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8372380-9188-4c0f-9e75-05739d26a27c-etc-swift\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.964912 master-0 kubenswrapper[15202]: I0319 09:49:39.964824 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-dispersionconf\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.964965 master-0 kubenswrapper[15202]: I0319 09:49:39.964922 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-combined-ca-bundle\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.965032 master-0 kubenswrapper[15202]: I0319 09:49:39.964965 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8372380-9188-4c0f-9e75-05739d26a27c-etc-swift\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.966937 master-0 kubenswrapper[15202]: I0319 09:49:39.966906 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-swiftconf\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.970784 master-0 kubenswrapper[15202]: I0319 09:49:39.970737 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-combined-ca-bundle\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.974387 master-0 kubenswrapper[15202]: I0319 09:49:39.974338 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-dispersionconf\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:39.974681 master-0 kubenswrapper[15202]: I0319 09:49:39.974612 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7dlz8"] Mar 19 09:49:39.977837 master-0 kubenswrapper[15202]: I0319 09:49:39.976060 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:39.978947 master-0 kubenswrapper[15202]: I0319 09:49:39.978331 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 09:49:40.006351 master-0 kubenswrapper[15202]: I0319 09:49:40.006293 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7dlz8"] Mar 19 09:49:40.020174 master-0 kubenswrapper[15202]: I0319 09:49:40.018791 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wns86\" (UniqueName: \"kubernetes.io/projected/e8372380-9188-4c0f-9e75-05739d26a27c-kube-api-access-wns86\") pod \"swift-ring-rebalance-l8hw9\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") " pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:40.074190 master-0 kubenswrapper[15202]: I0319 09:49:40.074138 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-ovn-rundir\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.074529 master-0 kubenswrapper[15202]: I0319 09:49:40.074499 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-combined-ca-bundle\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.074733 master-0 kubenswrapper[15202]: I0319 09:49:40.074713 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8xp\" (UniqueName: \"kubernetes.io/projected/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-kube-api-access-hw8xp\") pod 
\"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.075205 master-0 kubenswrapper[15202]: I0319 09:49:40.075179 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-ovs-rundir\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.075443 master-0 kubenswrapper[15202]: I0319 09:49:40.075422 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.075744 master-0 kubenswrapper[15202]: I0319 09:49:40.075723 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-config\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.084456 master-0 kubenswrapper[15202]: I0319 09:49:40.083451 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-l8hw9" Mar 19 09:49:40.179033 master-0 kubenswrapper[15202]: I0319 09:49:40.178367 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-ovs-rundir\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.179033 master-0 kubenswrapper[15202]: I0319 09:49:40.178842 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.179033 master-0 kubenswrapper[15202]: I0319 09:49:40.178885 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-config\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.179033 master-0 kubenswrapper[15202]: I0319 09:49:40.178963 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-ovn-rundir\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.179033 master-0 kubenswrapper[15202]: I0319 09:49:40.178997 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-combined-ca-bundle\") pod \"ovn-controller-metrics-7dlz8\" (UID: 
\"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.179323 master-0 kubenswrapper[15202]: I0319 09:49:40.179090 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8xp\" (UniqueName: \"kubernetes.io/projected/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-kube-api-access-hw8xp\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.179437 master-0 kubenswrapper[15202]: I0319 09:49:40.178731 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-ovs-rundir\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.181086 master-0 kubenswrapper[15202]: I0319 09:49:40.180026 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-ovn-rundir\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.181747 master-0 kubenswrapper[15202]: I0319 09:49:40.181707 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-config\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.185525 master-0 kubenswrapper[15202]: I0319 09:49:40.185007 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7dlz8\" (UID: 
\"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.185720 master-0 kubenswrapper[15202]: I0319 09:49:40.185611 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-combined-ca-bundle\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.224504 master-0 kubenswrapper[15202]: I0319 09:49:40.222123 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8xp\" (UniqueName: \"kubernetes.io/projected/4ccdb24a-f249-4ca8-9f50-769bac7da7f0-kube-api-access-hw8xp\") pod \"ovn-controller-metrics-7dlz8\" (UID: \"4ccdb24a-f249-4ca8-9f50-769bac7da7f0\") " pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.297965 master-0 kubenswrapper[15202]: I0319 09:49:40.297527 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-2qz2r"] Mar 19 09:49:40.319911 master-0 kubenswrapper[15202]: I0319 09:49:40.319283 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6796764987-gtg4x"] Mar 19 09:49:40.332494 master-0 kubenswrapper[15202]: I0319 09:49:40.329365 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6796764987-gtg4x"] Mar 19 09:49:40.332494 master-0 kubenswrapper[15202]: I0319 09:49:40.329564 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.334433 master-0 kubenswrapper[15202]: I0319 09:49:40.334370 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 09:49:40.393540 master-0 kubenswrapper[15202]: I0319 09:49:40.392904 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7dlz8" Mar 19 09:49:40.490254 master-0 kubenswrapper[15202]: I0319 09:49:40.490179 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6796764987-gtg4x"] Mar 19 09:49:40.491005 master-0 kubenswrapper[15202]: E0319 09:49:40.490969 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-fktrg ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6796764987-gtg4x" podUID="483a7209-e82c-4980-a9cf-8de8f727c1ae" Mar 19 09:49:40.533972 master-0 kubenswrapper[15202]: I0319 09:49:40.527933 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fktrg\" (UniqueName: \"kubernetes.io/projected/483a7209-e82c-4980-a9cf-8de8f727c1ae-kube-api-access-fktrg\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.533972 master-0 kubenswrapper[15202]: I0319 09:49:40.528000 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.533972 master-0 kubenswrapper[15202]: I0319 09:49:40.528060 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-config\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.533972 master-0 kubenswrapper[15202]: I0319 09:49:40.528082 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-dns-svc\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.568265 master-0 kubenswrapper[15202]: I0319 09:49:40.543624 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-vtxcj"] Mar 19 09:49:40.568265 master-0 kubenswrapper[15202]: I0319 09:49:40.546226 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.568265 master-0 kubenswrapper[15202]: I0319 09:49:40.549890 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 09:49:40.616121 master-0 kubenswrapper[15202]: I0319 09:49:40.605733 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-vtxcj"] Mar 19 09:49:40.636502 master-0 kubenswrapper[15202]: I0319 09:49:40.634669 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fktrg\" (UniqueName: \"kubernetes.io/projected/483a7209-e82c-4980-a9cf-8de8f727c1ae-kube-api-access-fktrg\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.636502 master-0 kubenswrapper[15202]: I0319 09:49:40.634745 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.636502 master-0 kubenswrapper[15202]: I0319 09:49:40.634814 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-config\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.636502 master-0 kubenswrapper[15202]: I0319 09:49:40.634840 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-dns-svc\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.636803 master-0 kubenswrapper[15202]: I0319 09:49:40.636713 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-config\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.657523 master-0 kubenswrapper[15202]: I0319 09:49:40.637288 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-dns-svc\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.657523 master-0 kubenswrapper[15202]: I0319 09:49:40.639103 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.738502 master-0 kubenswrapper[15202]: I0319 09:49:40.736793 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxkx\" (UniqueName: 
\"kubernetes.io/projected/6345bc32-ed02-4534-830a-229d7f9e4975-kube-api-access-mbxkx\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.738502 master-0 kubenswrapper[15202]: I0319 09:49:40.736916 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.738502 master-0 kubenswrapper[15202]: I0319 09:49:40.737003 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.738502 master-0 kubenswrapper[15202]: I0319 09:49:40.737037 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-config\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.738502 master-0 kubenswrapper[15202]: I0319 09:49:40.737058 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.840621 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mbxkx\" (UniqueName: \"kubernetes.io/projected/6345bc32-ed02-4534-830a-229d7f9e4975-kube-api-access-mbxkx\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.840723 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.840756 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.840844 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.840876 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-config\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.840895 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: I0319 09:49:40.841765 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: E0319 09:49:40.842165 15202 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:49:40.842319 master-0 kubenswrapper[15202]: E0319 09:49:40.842180 15202 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:49:40.855483 master-0 kubenswrapper[15202]: I0319 09:49:40.855410 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-config\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.856058 master-0 kubenswrapper[15202]: I0319 09:49:40.856029 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.862188 master-0 kubenswrapper[15202]: E0319 09:49:40.862086 15202 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift podName:d9c99748-0ca1-4e25-947a-801c2e8748f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:49:42.842201477 +0000 UTC m=+1500.227616293 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift") pod "swift-storage-0" (UID: "d9c99748-0ca1-4e25-947a-801c2e8748f5") : configmap "swift-ring-files" not found Mar 19 09:49:40.863186 master-0 kubenswrapper[15202]: I0319 09:49:40.863145 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:40.871025 master-0 kubenswrapper[15202]: I0319 09:49:40.870826 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d5fb6d2-3b43-49ba-ba81-25e6cdfebfd2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7f360703-01f1-49e9-9811-8d9dce81c720\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:40.871540 master-0 kubenswrapper[15202]: I0319 09:49:40.871312 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:40.873991 master-0 kubenswrapper[15202]: I0319 09:49:40.873953 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:40.905849 master-0 kubenswrapper[15202]: I0319 09:49:40.905785 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:41.055933 master-0 kubenswrapper[15202]: I0319 09:49:41.055882 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fktrg\" (UniqueName: \"kubernetes.io/projected/483a7209-e82c-4980-a9cf-8de8f727c1ae-kube-api-access-fktrg\") pod \"dnsmasq-dns-6796764987-gtg4x\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:41.118134 master-0 kubenswrapper[15202]: I0319 09:49:41.118071 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-l8hw9"] Mar 19 09:49:41.122912 master-0 kubenswrapper[15202]: W0319 09:49:41.122853 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8372380_9188_4c0f_9e75_05739d26a27c.slice/crio-7b57876d094fd8c72acec31ca3282dff43e16d701151d700bb90e1ca94fd2cff WatchSource:0}: Error finding container 7b57876d094fd8c72acec31ca3282dff43e16d701151d700bb90e1ca94fd2cff: Status 404 returned error can't find the container with id 7b57876d094fd8c72acec31ca3282dff43e16d701151d700bb90e1ca94fd2cff Mar 19 09:49:41.139208 master-0 kubenswrapper[15202]: I0319 09:49:41.139153 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxkx\" (UniqueName: \"kubernetes.io/projected/6345bc32-ed02-4534-830a-229d7f9e4975-kube-api-access-mbxkx\") pod \"dnsmasq-dns-5bf8b865dc-vtxcj\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:41.251649 master-0 kubenswrapper[15202]: I0319 09:49:41.227420 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:41.453829 master-0 kubenswrapper[15202]: I0319 09:49:41.453770 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-config\") pod \"483a7209-e82c-4980-a9cf-8de8f727c1ae\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " Mar 19 09:49:41.454138 master-0 kubenswrapper[15202]: I0319 09:49:41.453883 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-dns-svc\") pod \"483a7209-e82c-4980-a9cf-8de8f727c1ae\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " Mar 19 09:49:41.454138 master-0 kubenswrapper[15202]: I0319 09:49:41.453949 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fktrg\" (UniqueName: \"kubernetes.io/projected/483a7209-e82c-4980-a9cf-8de8f727c1ae-kube-api-access-fktrg\") pod \"483a7209-e82c-4980-a9cf-8de8f727c1ae\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " Mar 19 09:49:41.454243 master-0 kubenswrapper[15202]: I0319 09:49:41.454214 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-ovsdbserver-nb\") pod \"483a7209-e82c-4980-a9cf-8de8f727c1ae\" (UID: \"483a7209-e82c-4980-a9cf-8de8f727c1ae\") " Mar 19 09:49:41.454448 master-0 kubenswrapper[15202]: I0319 09:49:41.454391 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "483a7209-e82c-4980-a9cf-8de8f727c1ae" (UID: "483a7209-e82c-4980-a9cf-8de8f727c1ae"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:41.454945 master-0 kubenswrapper[15202]: I0319 09:49:41.454912 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "483a7209-e82c-4980-a9cf-8de8f727c1ae" (UID: "483a7209-e82c-4980-a9cf-8de8f727c1ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:41.457707 master-0 kubenswrapper[15202]: I0319 09:49:41.456464 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-config" (OuterVolumeSpecName: "config") pod "483a7209-e82c-4980-a9cf-8de8f727c1ae" (UID: "483a7209-e82c-4980-a9cf-8de8f727c1ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:41.457951 master-0 kubenswrapper[15202]: I0319 09:49:41.457886 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:41.457951 master-0 kubenswrapper[15202]: I0319 09:49:41.457916 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:41.457951 master-0 kubenswrapper[15202]: I0319 09:49:41.457931 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483a7209-e82c-4980-a9cf-8de8f727c1ae-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:41.462196 master-0 kubenswrapper[15202]: I0319 09:49:41.462028 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/483a7209-e82c-4980-a9cf-8de8f727c1ae-kube-api-access-fktrg" (OuterVolumeSpecName: "kube-api-access-fktrg") pod "483a7209-e82c-4980-a9cf-8de8f727c1ae" (UID: "483a7209-e82c-4980-a9cf-8de8f727c1ae"). InnerVolumeSpecName "kube-api-access-fktrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:41.559923 master-0 kubenswrapper[15202]: I0319 09:49:41.559856 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fktrg\" (UniqueName: \"kubernetes.io/projected/483a7209-e82c-4980-a9cf-8de8f727c1ae-kube-api-access-fktrg\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:41.621784 master-0 kubenswrapper[15202]: I0319 09:49:41.619855 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7dlz8"] Mar 19 09:49:41.646280 master-0 kubenswrapper[15202]: I0319 09:49:41.644631 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 09:49:41.661371 master-0 kubenswrapper[15202]: I0319 09:49:41.656242 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 09:49:41.667648 master-0 kubenswrapper[15202]: I0319 09:49:41.664043 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 09:49:41.667648 master-0 kubenswrapper[15202]: I0319 09:49:41.664610 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 09:49:41.667648 master-0 kubenswrapper[15202]: I0319 09:49:41.664758 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 09:49:41.710568 master-0 kubenswrapper[15202]: I0319 09:49:41.708240 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 09:49:41.724561 master-0 kubenswrapper[15202]: W0319 09:49:41.724405 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345bc32_ed02_4534_830a_229d7f9e4975.slice/crio-40461aab4a632ecc52ef5b4fb81785895cc871a798e5c330b78b6be17068c31a WatchSource:0}: Error finding container 40461aab4a632ecc52ef5b4fb81785895cc871a798e5c330b78b6be17068c31a: Status 404 returned error can't find the container with id 40461aab4a632ecc52ef5b4fb81785895cc871a798e5c330b78b6be17068c31a Mar 19 09:49:41.737251 master-0 kubenswrapper[15202]: I0319 09:49:41.737175 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-vtxcj"] Mar 19 09:49:41.772520 master-0 kubenswrapper[15202]: I0319 09:49:41.772409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79dsv\" (UniqueName: \"kubernetes.io/projected/b3afa041-e3b0-469b-810e-ce69f3a88264-kube-api-access-79dsv\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.772707 master-0 kubenswrapper[15202]: I0319 09:49:41.772546 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3afa041-e3b0-469b-810e-ce69f3a88264-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.772707 master-0 kubenswrapper[15202]: I0319 09:49:41.772603 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afa041-e3b0-469b-810e-ce69f3a88264-config\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.772784 master-0 kubenswrapper[15202]: I0319 09:49:41.772729 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.772784 master-0 kubenswrapper[15202]: I0319 09:49:41.772777 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.772854 master-0 kubenswrapper[15202]: I0319 09:49:41.772827 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3afa041-e3b0-469b-810e-ce69f3a88264-scripts\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.772888 master-0 kubenswrapper[15202]: I0319 09:49:41.772862 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875023 master-0 kubenswrapper[15202]: I0319 09:49:41.874980 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3afa041-e3b0-469b-810e-ce69f3a88264-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875417 master-0 kubenswrapper[15202]: I0319 09:49:41.875035 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afa041-e3b0-469b-810e-ce69f3a88264-config\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875417 master-0 kubenswrapper[15202]: I0319 09:49:41.875159 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875417 master-0 kubenswrapper[15202]: I0319 09:49:41.875197 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875417 master-0 kubenswrapper[15202]: I0319 09:49:41.875216 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3afa041-e3b0-469b-810e-ce69f3a88264-scripts\") pod \"ovn-northd-0\" (UID: 
\"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875417 master-0 kubenswrapper[15202]: I0319 09:49:41.875245 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.875417 master-0 kubenswrapper[15202]: I0319 09:49:41.875348 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79dsv\" (UniqueName: \"kubernetes.io/projected/b3afa041-e3b0-469b-810e-ce69f3a88264-kube-api-access-79dsv\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.879242 master-0 kubenswrapper[15202]: I0319 09:49:41.879217 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afa041-e3b0-469b-810e-ce69f3a88264-config\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.879358 master-0 kubenswrapper[15202]: I0319 09:49:41.879324 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3afa041-e3b0-469b-810e-ce69f3a88264-scripts\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.879913 master-0 kubenswrapper[15202]: I0319 09:49:41.879887 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3afa041-e3b0-469b-810e-ce69f3a88264-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.883686 master-0 kubenswrapper[15202]: I0319 09:49:41.883117 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.884276 master-0 kubenswrapper[15202]: I0319 09:49:41.884244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.885348 master-0 kubenswrapper[15202]: I0319 09:49:41.885315 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3afa041-e3b0-469b-810e-ce69f3a88264-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.890116 master-0 kubenswrapper[15202]: I0319 09:49:41.890059 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" event={"ID":"6345bc32-ed02-4534-830a-229d7f9e4975","Type":"ContainerStarted","Data":"40461aab4a632ecc52ef5b4fb81785895cc871a798e5c330b78b6be17068c31a"} Mar 19 09:49:41.894123 master-0 kubenswrapper[15202]: I0319 09:49:41.894079 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79dsv\" (UniqueName: \"kubernetes.io/projected/b3afa041-e3b0-469b-810e-ce69f3a88264-kube-api-access-79dsv\") pod \"ovn-northd-0\" (UID: \"b3afa041-e3b0-469b-810e-ce69f3a88264\") " pod="openstack/ovn-northd-0" Mar 19 09:49:41.894481 master-0 kubenswrapper[15202]: I0319 09:49:41.894418 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l8hw9" 
event={"ID":"e8372380-9188-4c0f-9e75-05739d26a27c","Type":"ContainerStarted","Data":"7b57876d094fd8c72acec31ca3282dff43e16d701151d700bb90e1ca94fd2cff"} Mar 19 09:49:41.908164 master-0 kubenswrapper[15202]: I0319 09:49:41.908097 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6796764987-gtg4x" Mar 19 09:49:41.909525 master-0 kubenswrapper[15202]: I0319 09:49:41.909406 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7dlz8" event={"ID":"4ccdb24a-f249-4ca8-9f50-769bac7da7f0","Type":"ContainerStarted","Data":"e4b829709d522a2a0d6e536b65ed2f12648b4f9c72da6c757bef0bb39129bc86"} Mar 19 09:49:41.911132 master-0 kubenswrapper[15202]: I0319 09:49:41.911086 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" podUID="d7a14756-b516-4318-ba07-94afdd584022" containerName="dnsmasq-dns" containerID="cri-o://8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0" gracePeriod=10 Mar 19 09:49:41.977599 master-0 kubenswrapper[15202]: I0319 09:49:41.977443 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6796764987-gtg4x"] Mar 19 09:49:41.986897 master-0 kubenswrapper[15202]: I0319 09:49:41.986812 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6796764987-gtg4x"] Mar 19 09:49:42.036317 master-0 kubenswrapper[15202]: I0319 09:49:42.036256 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 09:49:42.682691 master-0 kubenswrapper[15202]: I0319 09:49:42.680711 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 09:49:42.750844 master-0 kubenswrapper[15202]: I0319 09:49:42.750789 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:42.852028 master-0 kubenswrapper[15202]: I0319 09:49:42.851863 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483a7209-e82c-4980-a9cf-8de8f727c1ae" path="/var/lib/kubelet/pods/483a7209-e82c-4980-a9cf-8de8f727c1ae/volumes" Mar 19 09:49:42.918833 master-0 kubenswrapper[15202]: I0319 09:49:42.916802 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-config\") pod \"d7a14756-b516-4318-ba07-94afdd584022\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " Mar 19 09:49:42.918833 master-0 kubenswrapper[15202]: I0319 09:49:42.917104 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gr8b\" (UniqueName: \"kubernetes.io/projected/d7a14756-b516-4318-ba07-94afdd584022-kube-api-access-6gr8b\") pod \"d7a14756-b516-4318-ba07-94afdd584022\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " Mar 19 09:49:42.918833 master-0 kubenswrapper[15202]: I0319 09:49:42.917175 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-dns-svc\") pod \"d7a14756-b516-4318-ba07-94afdd584022\" (UID: \"d7a14756-b516-4318-ba07-94afdd584022\") " Mar 19 09:49:42.918833 master-0 kubenswrapper[15202]: I0319 09:49:42.917882 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:42.924603 master-0 kubenswrapper[15202]: E0319 09:49:42.922808 15202 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 
09:49:42.924603 master-0 kubenswrapper[15202]: E0319 09:49:42.922845 15202 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:49:42.924603 master-0 kubenswrapper[15202]: E0319 09:49:42.922898 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift podName:d9c99748-0ca1-4e25-947a-801c2e8748f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:49:46.922881955 +0000 UTC m=+1504.308296771 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift") pod "swift-storage-0" (UID: "d9c99748-0ca1-4e25-947a-801c2e8748f5") : configmap "swift-ring-files" not found Mar 19 09:49:42.924603 master-0 kubenswrapper[15202]: I0319 09:49:42.924202 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a14756-b516-4318-ba07-94afdd584022-kube-api-access-6gr8b" (OuterVolumeSpecName: "kube-api-access-6gr8b") pod "d7a14756-b516-4318-ba07-94afdd584022" (UID: "d7a14756-b516-4318-ba07-94afdd584022"). InnerVolumeSpecName "kube-api-access-6gr8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:42.936112 master-0 kubenswrapper[15202]: I0319 09:49:42.935452 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7dlz8" event={"ID":"4ccdb24a-f249-4ca8-9f50-769bac7da7f0","Type":"ContainerStarted","Data":"99d08f4dafda38ab9c13a4d15fb0a056473f3ae759d8392d4c9bb0b4fec00ec2"} Mar 19 09:49:42.939760 master-0 kubenswrapper[15202]: I0319 09:49:42.939055 15202 generic.go:334] "Generic (PLEG): container finished" podID="6345bc32-ed02-4534-830a-229d7f9e4975" containerID="f879d65e6b05b29ec0ecdb7a3aef03b0c5ba8763137a1b1c2ef6ee3fa4086b25" exitCode=0 Mar 19 09:49:42.939760 master-0 kubenswrapper[15202]: I0319 09:49:42.939114 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" event={"ID":"6345bc32-ed02-4534-830a-229d7f9e4975","Type":"ContainerDied","Data":"f879d65e6b05b29ec0ecdb7a3aef03b0c5ba8763137a1b1c2ef6ee3fa4086b25"} Mar 19 09:49:42.949547 master-0 kubenswrapper[15202]: I0319 09:49:42.949436 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" event={"ID":"d7a14756-b516-4318-ba07-94afdd584022","Type":"ContainerDied","Data":"8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0"} Mar 19 09:49:42.949547 master-0 kubenswrapper[15202]: I0319 09:49:42.949511 15202 scope.go:117] "RemoveContainer" containerID="8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0" Mar 19 09:49:42.949756 master-0 kubenswrapper[15202]: I0319 09:49:42.949713 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" Mar 19 09:49:42.950423 master-0 kubenswrapper[15202]: I0319 09:49:42.949996 15202 generic.go:334] "Generic (PLEG): container finished" podID="d7a14756-b516-4318-ba07-94afdd584022" containerID="8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0" exitCode=0 Mar 19 09:49:42.950423 master-0 kubenswrapper[15202]: I0319 09:49:42.950058 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-2qz2r" event={"ID":"d7a14756-b516-4318-ba07-94afdd584022","Type":"ContainerDied","Data":"9feca4a6d068273d1b5aa7fb7dfaa2d8b78e58def58954aaf291cd44f2953e83"} Mar 19 09:49:42.951858 master-0 kubenswrapper[15202]: I0319 09:49:42.951486 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b3afa041-e3b0-469b-810e-ce69f3a88264","Type":"ContainerStarted","Data":"c97b6e679af7e034e1c48d769d2c8e84866eeb2236e552763255d338718de0e0"} Mar 19 09:49:42.988129 master-0 kubenswrapper[15202]: I0319 09:49:42.988088 15202 scope.go:117] "RemoveContainer" containerID="617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4" Mar 19 09:49:43.019635 master-0 kubenswrapper[15202]: I0319 09:49:43.019591 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-config" (OuterVolumeSpecName: "config") pod "d7a14756-b516-4318-ba07-94afdd584022" (UID: "d7a14756-b516-4318-ba07-94afdd584022"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:43.021323 master-0 kubenswrapper[15202]: I0319 09:49:43.021260 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gr8b\" (UniqueName: \"kubernetes.io/projected/d7a14756-b516-4318-ba07-94afdd584022-kube-api-access-6gr8b\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:43.025210 master-0 kubenswrapper[15202]: I0319 09:49:43.025090 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7a14756-b516-4318-ba07-94afdd584022" (UID: "d7a14756-b516-4318-ba07-94afdd584022"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:43.025578 master-0 kubenswrapper[15202]: I0319 09:49:43.025523 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7dlz8" podStartSLOduration=4.025506625 podStartE2EDuration="4.025506625s" podCreationTimestamp="2026-03-19 09:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:42.994921781 +0000 UTC m=+1500.380336597" watchObservedRunningTime="2026-03-19 09:49:43.025506625 +0000 UTC m=+1500.410921441" Mar 19 09:49:43.124558 master-0 kubenswrapper[15202]: I0319 09:49:43.123671 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:43.124558 master-0 kubenswrapper[15202]: I0319 09:49:43.123713 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7a14756-b516-4318-ba07-94afdd584022-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:43.326567 master-0 kubenswrapper[15202]: I0319 09:49:43.324616 
15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-2qz2r"] Mar 19 09:49:43.339321 master-0 kubenswrapper[15202]: I0319 09:49:43.339242 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-2qz2r"] Mar 19 09:49:43.824715 master-0 kubenswrapper[15202]: I0319 09:49:43.824656 15202 trace.go:236] Trace[383153071]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (19-Mar-2026 09:49:42.811) (total time: 1013ms): Mar 19 09:49:43.824715 master-0 kubenswrapper[15202]: Trace[383153071]: [1.013295907s] [1.013295907s] END Mar 19 09:49:43.981802 master-0 kubenswrapper[15202]: I0319 09:49:43.980880 15202 trace.go:236] Trace[1094274362]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (19-Mar-2026 09:49:42.810) (total time: 1170ms): Mar 19 09:49:43.981802 master-0 kubenswrapper[15202]: Trace[1094274362]: [1.170321118s] [1.170321118s] END Mar 19 09:49:44.824172 master-0 kubenswrapper[15202]: I0319 09:49:44.824016 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a14756-b516-4318-ba07-94afdd584022" path="/var/lib/kubelet/pods/d7a14756-b516-4318-ba07-94afdd584022/volumes" Mar 19 09:49:45.032989 master-0 kubenswrapper[15202]: I0319 09:49:45.032896 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 09:49:45.033599 master-0 kubenswrapper[15202]: I0319 09:49:45.033035 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 09:49:45.121366 master-0 kubenswrapper[15202]: I0319 09:49:45.121241 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 09:49:45.847928 master-0 kubenswrapper[15202]: I0319 09:49:45.847864 15202 scope.go:117] "RemoveContainer" containerID="8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0" Mar 
19 09:49:45.848560 master-0 kubenswrapper[15202]: E0319 09:49:45.848508 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0\": container with ID starting with 8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0 not found: ID does not exist" containerID="8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0" Mar 19 09:49:45.848622 master-0 kubenswrapper[15202]: I0319 09:49:45.848568 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0"} err="failed to get container status \"8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0\": rpc error: code = NotFound desc = could not find container \"8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0\": container with ID starting with 8add6faf7af5d792b8ce15169779e3f7419b96b09bcb1d65b5a7b7dbe96e8ac0 not found: ID does not exist" Mar 19 09:49:45.848622 master-0 kubenswrapper[15202]: I0319 09:49:45.848604 15202 scope.go:117] "RemoveContainer" containerID="617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4" Mar 19 09:49:45.849163 master-0 kubenswrapper[15202]: E0319 09:49:45.849130 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4\": container with ID starting with 617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4 not found: ID does not exist" containerID="617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4" Mar 19 09:49:45.849219 master-0 kubenswrapper[15202]: I0319 09:49:45.849157 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4"} 
err="failed to get container status \"617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4\": rpc error: code = NotFound desc = could not find container \"617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4\": container with ID starting with 617b2d5f3a0333e3160000719f506947d060132c38aef803e668e46f957b29d4 not found: ID does not exist" Mar 19 09:49:46.086876 master-0 kubenswrapper[15202]: I0319 09:49:46.086762 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 09:49:46.434367 master-0 kubenswrapper[15202]: I0319 09:49:46.434211 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:46.434367 master-0 kubenswrapper[15202]: I0319 09:49:46.434278 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:46.512414 master-0 kubenswrapper[15202]: I0319 09:49:46.511458 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:46.916872 master-0 kubenswrapper[15202]: I0319 09:49:46.916813 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fc3e-account-create-update-btzjb"] Mar 19 09:49:46.917483 master-0 kubenswrapper[15202]: E0319 09:49:46.917420 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a14756-b516-4318-ba07-94afdd584022" containerName="init" Mar 19 09:49:46.917483 master-0 kubenswrapper[15202]: I0319 09:49:46.917443 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a14756-b516-4318-ba07-94afdd584022" containerName="init" Mar 19 09:49:46.919349 master-0 kubenswrapper[15202]: E0319 09:49:46.917515 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a14756-b516-4318-ba07-94afdd584022" containerName="dnsmasq-dns" Mar 19 09:49:46.919349 master-0 kubenswrapper[15202]: I0319 
09:49:46.919344 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a14756-b516-4318-ba07-94afdd584022" containerName="dnsmasq-dns" Mar 19 09:49:46.919693 master-0 kubenswrapper[15202]: I0319 09:49:46.919670 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a14756-b516-4318-ba07-94afdd584022" containerName="dnsmasq-dns" Mar 19 09:49:46.920434 master-0 kubenswrapper[15202]: I0319 09:49:46.920410 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:46.922379 master-0 kubenswrapper[15202]: I0319 09:49:46.922335 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 09:49:46.928010 master-0 kubenswrapper[15202]: I0319 09:49:46.927943 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0" Mar 19 09:49:46.928196 master-0 kubenswrapper[15202]: E0319 09:49:46.928159 15202 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:49:46.928196 master-0 kubenswrapper[15202]: E0319 09:49:46.928179 15202 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:49:46.928271 master-0 kubenswrapper[15202]: E0319 09:49:46.928227 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift podName:d9c99748-0ca1-4e25-947a-801c2e8748f5 nodeName:}" failed. No retries permitted until 2026-03-19 09:49:54.928210662 +0000 UTC m=+1512.313625478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift") pod "swift-storage-0" (UID: "d9c99748-0ca1-4e25-947a-801c2e8748f5") : configmap "swift-ring-files" not found Mar 19 09:49:46.931746 master-0 kubenswrapper[15202]: I0319 09:49:46.931672 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fc3e-account-create-update-btzjb"] Mar 19 09:49:46.999970 master-0 kubenswrapper[15202]: I0319 09:49:46.999846 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nj4vf"] Mar 19 09:49:47.003941 master-0 kubenswrapper[15202]: I0319 09:49:47.003891 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.005027 master-0 kubenswrapper[15202]: I0319 09:49:47.004963 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b3afa041-e3b0-469b-810e-ce69f3a88264","Type":"ContainerStarted","Data":"abad534c4f7c605a8cb0f6d38d94335c9bc71b244db36b519aa6cb74adbcc487"} Mar 19 09:49:47.005099 master-0 kubenswrapper[15202]: I0319 09:49:47.005032 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b3afa041-e3b0-469b-810e-ce69f3a88264","Type":"ContainerStarted","Data":"0e3242f9a2716d7dee95ab57c9793279d718ec55110c2106088d4fd22017c232"} Mar 19 09:49:47.005099 master-0 kubenswrapper[15202]: I0319 09:49:47.005090 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 09:49:47.009337 master-0 kubenswrapper[15202]: I0319 09:49:47.009281 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" event={"ID":"6345bc32-ed02-4534-830a-229d7f9e4975","Type":"ContainerStarted","Data":"ec2e6f23bffac906678d7bfaaa7e31b24920ce9996b1ce96040da8018a7a078d"} Mar 19 09:49:47.009900 master-0 kubenswrapper[15202]: I0319 
09:49:47.009857 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:47.010862 master-0 kubenswrapper[15202]: I0319 09:49:47.010752 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nj4vf"] Mar 19 09:49:47.011137 master-0 kubenswrapper[15202]: I0319 09:49:47.011081 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l8hw9" event={"ID":"e8372380-9188-4c0f-9e75-05739d26a27c","Type":"ContainerStarted","Data":"c8843573ed1aa244dac508e291e031e81054ef65f66461b2b12bc4e1f57fed52"} Mar 19 09:49:47.031042 master-0 kubenswrapper[15202]: I0319 09:49:47.030921 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h22h\" (UniqueName: \"kubernetes.io/projected/db9a43e6-7b74-426b-83ee-50df7a0270e7-kube-api-access-9h22h\") pod \"glance-db-create-nj4vf\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.031223 master-0 kubenswrapper[15202]: I0319 09:49:47.031076 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldhf\" (UniqueName: \"kubernetes.io/projected/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-kube-api-access-tldhf\") pod \"glance-fc3e-account-create-update-btzjb\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.031977 master-0 kubenswrapper[15202]: I0319 09:49:47.031941 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-operator-scripts\") pod \"glance-fc3e-account-create-update-btzjb\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.032147 master-0 
kubenswrapper[15202]: I0319 09:49:47.032098 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db9a43e6-7b74-426b-83ee-50df7a0270e7-operator-scripts\") pod \"glance-db-create-nj4vf\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.049503 master-0 kubenswrapper[15202]: I0319 09:49:47.048344 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.180985657 podStartE2EDuration="6.048298752s" podCreationTimestamp="2026-03-19 09:49:41 +0000 UTC" firstStartedPulling="2026-03-19 09:49:42.69223858 +0000 UTC m=+1500.077653396" lastFinishedPulling="2026-03-19 09:49:46.559551665 +0000 UTC m=+1503.944966491" observedRunningTime="2026-03-19 09:49:47.041907844 +0000 UTC m=+1504.427322680" watchObservedRunningTime="2026-03-19 09:49:47.048298752 +0000 UTC m=+1504.433713568" Mar 19 09:49:47.066644 master-0 kubenswrapper[15202]: I0319 09:49:47.066544 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" podStartSLOduration=7.066523211 podStartE2EDuration="7.066523211s" podCreationTimestamp="2026-03-19 09:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:47.064293516 +0000 UTC m=+1504.449708332" watchObservedRunningTime="2026-03-19 09:49:47.066523211 +0000 UTC m=+1504.451938027" Mar 19 09:49:47.088079 master-0 kubenswrapper[15202]: I0319 09:49:47.087944 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-l8hw9" podStartSLOduration=3.275901997 podStartE2EDuration="8.087918568s" podCreationTimestamp="2026-03-19 09:49:39 +0000 UTC" firstStartedPulling="2026-03-19 09:49:41.125997703 +0000 UTC m=+1498.511412519" 
lastFinishedPulling="2026-03-19 09:49:45.938014274 +0000 UTC m=+1503.323429090" observedRunningTime="2026-03-19 09:49:47.083360216 +0000 UTC m=+1504.468775042" watchObservedRunningTime="2026-03-19 09:49:47.087918568 +0000 UTC m=+1504.473333384" Mar 19 09:49:47.101879 master-0 kubenswrapper[15202]: I0319 09:49:47.097190 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 09:49:47.133954 master-0 kubenswrapper[15202]: I0319 09:49:47.133882 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h22h\" (UniqueName: \"kubernetes.io/projected/db9a43e6-7b74-426b-83ee-50df7a0270e7-kube-api-access-9h22h\") pod \"glance-db-create-nj4vf\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.134202 master-0 kubenswrapper[15202]: I0319 09:49:47.133971 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldhf\" (UniqueName: \"kubernetes.io/projected/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-kube-api-access-tldhf\") pod \"glance-fc3e-account-create-update-btzjb\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.134202 master-0 kubenswrapper[15202]: I0319 09:49:47.134192 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-operator-scripts\") pod \"glance-fc3e-account-create-update-btzjb\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.134313 master-0 kubenswrapper[15202]: I0319 09:49:47.134265 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db9a43e6-7b74-426b-83ee-50df7a0270e7-operator-scripts\") pod 
\"glance-db-create-nj4vf\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.140247 master-0 kubenswrapper[15202]: I0319 09:49:47.140204 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-operator-scripts\") pod \"glance-fc3e-account-create-update-btzjb\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.153895 master-0 kubenswrapper[15202]: I0319 09:49:47.152199 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db9a43e6-7b74-426b-83ee-50df7a0270e7-operator-scripts\") pod \"glance-db-create-nj4vf\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.156178 master-0 kubenswrapper[15202]: I0319 09:49:47.156143 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h22h\" (UniqueName: \"kubernetes.io/projected/db9a43e6-7b74-426b-83ee-50df7a0270e7-kube-api-access-9h22h\") pod \"glance-db-create-nj4vf\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.159941 master-0 kubenswrapper[15202]: I0319 09:49:47.159904 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldhf\" (UniqueName: \"kubernetes.io/projected/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-kube-api-access-tldhf\") pod \"glance-fc3e-account-create-update-btzjb\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.236984 master-0 kubenswrapper[15202]: I0319 09:49:47.236916 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:47.339936 master-0 kubenswrapper[15202]: I0319 09:49:47.339353 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:47.913240 master-0 kubenswrapper[15202]: I0319 09:49:47.913182 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fc3e-account-create-update-btzjb"] Mar 19 09:49:47.929629 master-0 kubenswrapper[15202]: I0319 09:49:47.929569 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nj4vf"] Mar 19 09:49:48.026952 master-0 kubenswrapper[15202]: I0319 09:49:48.026883 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nj4vf" event={"ID":"db9a43e6-7b74-426b-83ee-50df7a0270e7","Type":"ContainerStarted","Data":"4744293276cb211e17492c2d3dd4d7c90d99a8a780a92191569f090a5e39af7b"} Mar 19 09:49:48.029153 master-0 kubenswrapper[15202]: I0319 09:49:48.029102 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc3e-account-create-update-btzjb" event={"ID":"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817","Type":"ContainerStarted","Data":"200c99c05c35d3fa2654de6fdaba38130f82c8aa6f971c320df2543d3b15cbc4"} Mar 19 09:49:48.392208 master-0 kubenswrapper[15202]: I0319 09:49:48.392124 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k88tf"] Mar 19 09:49:48.394242 master-0 kubenswrapper[15202]: I0319 09:49:48.394204 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.399166 master-0 kubenswrapper[15202]: I0319 09:49:48.399092 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 09:49:48.406059 master-0 kubenswrapper[15202]: I0319 09:49:48.405991 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k88tf"] Mar 19 09:49:48.573411 master-0 kubenswrapper[15202]: I0319 09:49:48.573317 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24qx\" (UniqueName: \"kubernetes.io/projected/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-kube-api-access-j24qx\") pod \"root-account-create-update-k88tf\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.573796 master-0 kubenswrapper[15202]: I0319 09:49:48.573499 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-operator-scripts\") pod \"root-account-create-update-k88tf\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.675233 master-0 kubenswrapper[15202]: I0319 09:49:48.675146 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-operator-scripts\") pod \"root-account-create-update-k88tf\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.675656 master-0 kubenswrapper[15202]: I0319 09:49:48.675344 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j24qx\" (UniqueName: 
\"kubernetes.io/projected/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-kube-api-access-j24qx\") pod \"root-account-create-update-k88tf\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.676015 master-0 kubenswrapper[15202]: I0319 09:49:48.675975 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-operator-scripts\") pod \"root-account-create-update-k88tf\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.695758 master-0 kubenswrapper[15202]: I0319 09:49:48.695632 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24qx\" (UniqueName: \"kubernetes.io/projected/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-kube-api-access-j24qx\") pod \"root-account-create-update-k88tf\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:48.749570 master-0 kubenswrapper[15202]: I0319 09:49:48.749235 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:49.041863 master-0 kubenswrapper[15202]: I0319 09:49:49.041701 15202 generic.go:334] "Generic (PLEG): container finished" podID="ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" containerID="d9bce657371028171095e39d0d257d8bf1207d54602f1af9a00652c4bfc9ae5a" exitCode=0 Mar 19 09:49:49.042367 master-0 kubenswrapper[15202]: I0319 09:49:49.042314 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc3e-account-create-update-btzjb" event={"ID":"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817","Type":"ContainerDied","Data":"d9bce657371028171095e39d0d257d8bf1207d54602f1af9a00652c4bfc9ae5a"} Mar 19 09:49:49.045768 master-0 kubenswrapper[15202]: I0319 09:49:49.045713 15202 generic.go:334] "Generic (PLEG): container finished" podID="db9a43e6-7b74-426b-83ee-50df7a0270e7" containerID="482c2e0767034c6d9ed269e91940e6a2f51314a39b85e17162a1995591d2d1f8" exitCode=0 Mar 19 09:49:49.045982 master-0 kubenswrapper[15202]: I0319 09:49:49.045767 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nj4vf" event={"ID":"db9a43e6-7b74-426b-83ee-50df7a0270e7","Type":"ContainerDied","Data":"482c2e0767034c6d9ed269e91940e6a2f51314a39b85e17162a1995591d2d1f8"} Mar 19 09:49:49.235351 master-0 kubenswrapper[15202]: I0319 09:49:49.234726 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k88tf"] Mar 19 09:49:50.057921 master-0 kubenswrapper[15202]: I0319 09:49:50.057774 15202 generic.go:334] "Generic (PLEG): container finished" podID="c5e93cf3-44d7-4fd9-bda9-f761362a00b8" containerID="71bebb15c1a6dd7e020d7a54eff7e872970a7e0e20a5ff73cf24ccf5c304d482" exitCode=0 Mar 19 09:49:50.058500 master-0 kubenswrapper[15202]: I0319 09:49:50.057867 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k88tf" 
event={"ID":"c5e93cf3-44d7-4fd9-bda9-f761362a00b8","Type":"ContainerDied","Data":"71bebb15c1a6dd7e020d7a54eff7e872970a7e0e20a5ff73cf24ccf5c304d482"} Mar 19 09:49:50.058500 master-0 kubenswrapper[15202]: I0319 09:49:50.057969 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k88tf" event={"ID":"c5e93cf3-44d7-4fd9-bda9-f761362a00b8","Type":"ContainerStarted","Data":"8cda252a44ebb4d9dc357da3f472e1af87172739edccb972dfa49a7bdaedc0e6"} Mar 19 09:49:50.727456 master-0 kubenswrapper[15202]: I0319 09:49:50.727397 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:50.738360 master-0 kubenswrapper[15202]: I0319 09:49:50.738284 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:50.839033 master-0 kubenswrapper[15202]: I0319 09:49:50.838973 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tldhf\" (UniqueName: \"kubernetes.io/projected/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-kube-api-access-tldhf\") pod \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\" (UID: \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " Mar 19 09:49:50.839221 master-0 kubenswrapper[15202]: I0319 09:49:50.839204 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h22h\" (UniqueName: \"kubernetes.io/projected/db9a43e6-7b74-426b-83ee-50df7a0270e7-kube-api-access-9h22h\") pod \"db9a43e6-7b74-426b-83ee-50df7a0270e7\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " Mar 19 09:49:50.839671 master-0 kubenswrapper[15202]: I0319 09:49:50.839266 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-operator-scripts\") pod \"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\" (UID: 
\"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817\") " Mar 19 09:49:50.839671 master-0 kubenswrapper[15202]: I0319 09:49:50.839358 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db9a43e6-7b74-426b-83ee-50df7a0270e7-operator-scripts\") pod \"db9a43e6-7b74-426b-83ee-50df7a0270e7\" (UID: \"db9a43e6-7b74-426b-83ee-50df7a0270e7\") " Mar 19 09:49:50.839853 master-0 kubenswrapper[15202]: I0319 09:49:50.839804 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" (UID: "ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:50.839892 master-0 kubenswrapper[15202]: I0319 09:49:50.839851 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db9a43e6-7b74-426b-83ee-50df7a0270e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db9a43e6-7b74-426b-83ee-50df7a0270e7" (UID: "db9a43e6-7b74-426b-83ee-50df7a0270e7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:50.840365 master-0 kubenswrapper[15202]: I0319 09:49:50.840335 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:50.840420 master-0 kubenswrapper[15202]: I0319 09:49:50.840364 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db9a43e6-7b74-426b-83ee-50df7a0270e7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:50.842926 master-0 kubenswrapper[15202]: I0319 09:49:50.842857 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db9a43e6-7b74-426b-83ee-50df7a0270e7-kube-api-access-9h22h" (OuterVolumeSpecName: "kube-api-access-9h22h") pod "db9a43e6-7b74-426b-83ee-50df7a0270e7" (UID: "db9a43e6-7b74-426b-83ee-50df7a0270e7"). InnerVolumeSpecName "kube-api-access-9h22h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:50.855078 master-0 kubenswrapper[15202]: I0319 09:49:50.855008 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-kube-api-access-tldhf" (OuterVolumeSpecName: "kube-api-access-tldhf") pod "ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" (UID: "ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817"). InnerVolumeSpecName "kube-api-access-tldhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:50.942392 master-0 kubenswrapper[15202]: I0319 09:49:50.942341 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h22h\" (UniqueName: \"kubernetes.io/projected/db9a43e6-7b74-426b-83ee-50df7a0270e7-kube-api-access-9h22h\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:50.942670 master-0 kubenswrapper[15202]: I0319 09:49:50.942657 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tldhf\" (UniqueName: \"kubernetes.io/projected/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817-kube-api-access-tldhf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:51.075034 master-0 kubenswrapper[15202]: I0319 09:49:51.074933 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nj4vf" event={"ID":"db9a43e6-7b74-426b-83ee-50df7a0270e7","Type":"ContainerDied","Data":"4744293276cb211e17492c2d3dd4d7c90d99a8a780a92191569f090a5e39af7b"} Mar 19 09:49:51.076089 master-0 kubenswrapper[15202]: I0319 09:49:51.076063 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4744293276cb211e17492c2d3dd4d7c90d99a8a780a92191569f090a5e39af7b" Mar 19 09:49:51.076328 master-0 kubenswrapper[15202]: I0319 09:49:51.076023 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nj4vf" Mar 19 09:49:51.079373 master-0 kubenswrapper[15202]: I0319 09:49:51.079303 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fc3e-account-create-update-btzjb" Mar 19 09:49:51.079873 master-0 kubenswrapper[15202]: I0319 09:49:51.079611 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fc3e-account-create-update-btzjb" event={"ID":"ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817","Type":"ContainerDied","Data":"200c99c05c35d3fa2654de6fdaba38130f82c8aa6f971c320df2543d3b15cbc4"} Mar 19 09:49:51.079873 master-0 kubenswrapper[15202]: I0319 09:49:51.079664 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="200c99c05c35d3fa2654de6fdaba38130f82c8aa6f971c320df2543d3b15cbc4" Mar 19 09:49:51.230852 master-0 kubenswrapper[15202]: I0319 09:49:51.230589 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:49:51.328594 master-0 kubenswrapper[15202]: I0319 09:49:51.321986 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"] Mar 19 09:49:51.328594 master-0 kubenswrapper[15202]: I0319 09:49:51.322296 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerName="dnsmasq-dns" containerID="cri-o://23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001" gracePeriod=10 Mar 19 09:49:51.683357 master-0 kubenswrapper[15202]: I0319 09:49:51.683064 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:51.881239 master-0 kubenswrapper[15202]: I0319 09:49:51.880603 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j24qx\" (UniqueName: \"kubernetes.io/projected/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-kube-api-access-j24qx\") pod \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " Mar 19 09:49:51.881239 master-0 kubenswrapper[15202]: I0319 09:49:51.880697 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-operator-scripts\") pod \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\" (UID: \"c5e93cf3-44d7-4fd9-bda9-f761362a00b8\") " Mar 19 09:49:51.882616 master-0 kubenswrapper[15202]: I0319 09:49:51.881983 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5e93cf3-44d7-4fd9-bda9-f761362a00b8" (UID: "c5e93cf3-44d7-4fd9-bda9-f761362a00b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:51.888776 master-0 kubenswrapper[15202]: I0319 09:49:51.888682 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-kube-api-access-j24qx" (OuterVolumeSpecName: "kube-api-access-j24qx") pod "c5e93cf3-44d7-4fd9-bda9-f761362a00b8" (UID: "c5e93cf3-44d7-4fd9-bda9-f761362a00b8"). InnerVolumeSpecName "kube-api-access-j24qx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:51.984068 master-0 kubenswrapper[15202]: I0319 09:49:51.983907 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j24qx\" (UniqueName: \"kubernetes.io/projected/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-kube-api-access-j24qx\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:51.984068 master-0 kubenswrapper[15202]: I0319 09:49:51.984011 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e93cf3-44d7-4fd9-bda9-f761362a00b8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:52.088301 master-0 kubenswrapper[15202]: I0319 09:49:52.087456 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" Mar 19 09:49:52.092955 master-0 kubenswrapper[15202]: I0319 09:49:52.092897 15202 generic.go:334] "Generic (PLEG): container finished" podID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerID="23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001" exitCode=0 Mar 19 09:49:52.093356 master-0 kubenswrapper[15202]: I0319 09:49:52.092980 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" event={"ID":"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7","Type":"ContainerDied","Data":"23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001"} Mar 19 09:49:52.093356 master-0 kubenswrapper[15202]: I0319 09:49:52.093231 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" event={"ID":"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7","Type":"ContainerDied","Data":"251d4d1ac0805629bd519260b46532693cf3c202cc96f928e2dbb3873446eee7"} Mar 19 09:49:52.093356 master-0 kubenswrapper[15202]: I0319 09:49:52.093259 15202 scope.go:117] "RemoveContainer" containerID="23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001" Mar 19 09:49:52.093356 master-0 
kubenswrapper[15202]: I0319 09:49:52.093046 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4" Mar 19 09:49:52.099698 master-0 kubenswrapper[15202]: I0319 09:49:52.099040 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k88tf" event={"ID":"c5e93cf3-44d7-4fd9-bda9-f761362a00b8","Type":"ContainerDied","Data":"8cda252a44ebb4d9dc357da3f472e1af87172739edccb972dfa49a7bdaedc0e6"} Mar 19 09:49:52.099698 master-0 kubenswrapper[15202]: I0319 09:49:52.099093 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cda252a44ebb4d9dc357da3f472e1af87172739edccb972dfa49a7bdaedc0e6" Mar 19 09:49:52.099698 master-0 kubenswrapper[15202]: I0319 09:49:52.099174 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k88tf" Mar 19 09:49:52.135554 master-0 kubenswrapper[15202]: I0319 09:49:52.135519 15202 scope.go:117] "RemoveContainer" containerID="483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.199803 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zxw2c"] Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: E0319 09:49:52.200377 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" containerName="mariadb-account-create-update" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200393 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" containerName="mariadb-account-create-update" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: E0319 09:49:52.200434 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e93cf3-44d7-4fd9-bda9-f761362a00b8" containerName="mariadb-account-create-update" Mar 19 
09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200441 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e93cf3-44d7-4fd9-bda9-f761362a00b8" containerName="mariadb-account-create-update" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: E0319 09:49:52.200457 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerName="dnsmasq-dns" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200475 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerName="dnsmasq-dns" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: E0319 09:49:52.200485 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerName="init" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200491 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerName="init" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: E0319 09:49:52.200504 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db9a43e6-7b74-426b-83ee-50df7a0270e7" containerName="mariadb-database-create" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200511 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="db9a43e6-7b74-426b-83ee-50df7a0270e7" containerName="mariadb-database-create" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200716 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="db9a43e6-7b74-426b-83ee-50df7a0270e7" containerName="mariadb-database-create" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 09:49:52.200728 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" containerName="mariadb-account-create-update" Mar 19 09:49:52.202962 master-0 kubenswrapper[15202]: I0319 
09:49:52.200746 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" containerName="dnsmasq-dns" Mar 19 09:49:52.206240 master-0 kubenswrapper[15202]: I0319 09:49:52.206157 15202 scope.go:117] "RemoveContainer" containerID="23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001" Mar 19 09:49:52.206862 master-0 kubenswrapper[15202]: E0319 09:49:52.206835 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001\": container with ID starting with 23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001 not found: ID does not exist" containerID="23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001" Mar 19 09:49:52.206931 master-0 kubenswrapper[15202]: I0319 09:49:52.206873 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001"} err="failed to get container status \"23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001\": rpc error: code = NotFound desc = could not find container \"23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001\": container with ID starting with 23754dc1c38517ec79a371ee06b98f46cd122c8db3cb800ecee4ac5ae19ad001 not found: ID does not exist" Mar 19 09:49:52.206931 master-0 kubenswrapper[15202]: I0319 09:49:52.206897 15202 scope.go:117] "RemoveContainer" containerID="483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb" Mar 19 09:49:52.207233 master-0 kubenswrapper[15202]: E0319 09:49:52.207184 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb\": container with ID starting with 483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb not 
found: ID does not exist" containerID="483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb" Mar 19 09:49:52.207270 master-0 kubenswrapper[15202]: I0319 09:49:52.207233 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb"} err="failed to get container status \"483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb\": rpc error: code = NotFound desc = could not find container \"483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb\": container with ID starting with 483ae210c50aadaeceb509202f09cb994c8bf2e3154b45477ffa439b0da52bcb not found: ID does not exist" Mar 19 09:49:52.215023 master-0 kubenswrapper[15202]: I0319 09:49:52.209887 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e93cf3-44d7-4fd9-bda9-f761362a00b8" containerName="mariadb-account-create-update" Mar 19 09:49:52.215023 master-0 kubenswrapper[15202]: I0319 09:49:52.211037 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zxw2c"] Mar 19 09:49:52.215023 master-0 kubenswrapper[15202]: I0319 09:49:52.211160 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.215023 master-0 kubenswrapper[15202]: I0319 09:49:52.214114 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-config-data" Mar 19 09:49:52.297375 master-0 kubenswrapper[15202]: I0319 09:49:52.297301 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmscr\" (UniqueName: \"kubernetes.io/projected/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-kube-api-access-lmscr\") pod \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " Mar 19 09:49:52.298691 master-0 kubenswrapper[15202]: I0319 09:49:52.298645 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-dns-svc\") pod \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " Mar 19 09:49:52.298852 master-0 kubenswrapper[15202]: I0319 09:49:52.298820 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-config\") pod \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\" (UID: \"e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7\") " Mar 19 09:49:52.306131 master-0 kubenswrapper[15202]: I0319 09:49:52.306046 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-kube-api-access-lmscr" (OuterVolumeSpecName: "kube-api-access-lmscr") pod "e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" (UID: "e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7"). InnerVolumeSpecName "kube-api-access-lmscr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:52.344419 master-0 kubenswrapper[15202]: I0319 09:49:52.344360 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" (UID: "e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:52.355372 master-0 kubenswrapper[15202]: I0319 09:49:52.355314 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-config" (OuterVolumeSpecName: "config") pod "e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" (UID: "e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:52.404444 master-0 kubenswrapper[15202]: I0319 09:49:52.404281 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-db-sync-config-data\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.404444 master-0 kubenswrapper[15202]: I0319 09:49:52.404409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-combined-ca-bundle\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.404814 master-0 kubenswrapper[15202]: I0319 09:49:52.404538 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pbvc\" (UniqueName: 
\"kubernetes.io/projected/c9bce886-0af6-432e-ab68-b4af30a4defd-kube-api-access-9pbvc\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.404814 master-0 kubenswrapper[15202]: I0319 09:49:52.404642 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-config-data\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.404814 master-0 kubenswrapper[15202]: I0319 09:49:52.404755 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmscr\" (UniqueName: \"kubernetes.io/projected/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-kube-api-access-lmscr\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:52.404814 master-0 kubenswrapper[15202]: I0319 09:49:52.404767 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:52.404814 master-0 kubenswrapper[15202]: I0319 09:49:52.404777 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:52.438996 master-0 kubenswrapper[15202]: I0319 09:49:52.438915 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"] Mar 19 09:49:52.448103 master-0 kubenswrapper[15202]: I0319 09:49:52.448028 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-qk9z4"] Mar 19 09:49:52.506924 master-0 kubenswrapper[15202]: I0319 09:49:52.506811 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-config-data\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.507265 master-0 kubenswrapper[15202]: I0319 09:49:52.506989 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-db-sync-config-data\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.507265 master-0 kubenswrapper[15202]: I0319 09:49:52.507023 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-combined-ca-bundle\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.507265 master-0 kubenswrapper[15202]: I0319 09:49:52.507083 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pbvc\" (UniqueName: \"kubernetes.io/projected/c9bce886-0af6-432e-ab68-b4af30a4defd-kube-api-access-9pbvc\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.512805 master-0 kubenswrapper[15202]: I0319 09:49:52.512717 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-db-sync-config-data\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.512805 master-0 kubenswrapper[15202]: I0319 09:49:52.512773 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-combined-ca-bundle\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.514229 master-0 kubenswrapper[15202]: I0319 09:49:52.514191 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-config-data\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.535491 master-0 kubenswrapper[15202]: I0319 09:49:52.533364 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pbvc\" (UniqueName: \"kubernetes.io/projected/c9bce886-0af6-432e-ab68-b4af30a4defd-kube-api-access-9pbvc\") pod \"glance-db-sync-zxw2c\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.553491 master-0 kubenswrapper[15202]: I0319 09:49:52.553183 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wpbkz"] Mar 19 09:49:52.555287 master-0 kubenswrapper[15202]: I0319 09:49:52.554667 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wpbkz" Mar 19 09:49:52.555798 master-0 kubenswrapper[15202]: I0319 09:49:52.555724 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zxw2c" Mar 19 09:49:52.595661 master-0 kubenswrapper[15202]: I0319 09:49:52.594132 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wpbkz"] Mar 19 09:49:52.615360 master-0 kubenswrapper[15202]: I0319 09:49:52.615297 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68cac917-044f-4346-81f4-336f070d2557-operator-scripts\") pod \"keystone-db-create-wpbkz\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") " pod="openstack/keystone-db-create-wpbkz" Mar 19 09:49:52.615660 master-0 kubenswrapper[15202]: I0319 09:49:52.615450 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwnn\" (UniqueName: \"kubernetes.io/projected/68cac917-044f-4346-81f4-336f070d2557-kube-api-access-zvwnn\") pod \"keystone-db-create-wpbkz\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") " pod="openstack/keystone-db-create-wpbkz" Mar 19 09:49:52.687748 master-0 kubenswrapper[15202]: I0319 09:49:52.687610 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-16fb-account-create-update-8cp5c"] Mar 19 09:49:52.719528 master-0 kubenswrapper[15202]: I0319 09:49:52.710073 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-16fb-account-create-update-8cp5c"] Mar 19 09:49:52.719528 master-0 kubenswrapper[15202]: I0319 09:49:52.710225 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-16fb-account-create-update-8cp5c" Mar 19 09:49:52.719528 master-0 kubenswrapper[15202]: I0319 09:49:52.714763 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 09:49:52.730521 master-0 kubenswrapper[15202]: I0319 09:49:52.730293 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwnn\" (UniqueName: \"kubernetes.io/projected/68cac917-044f-4346-81f4-336f070d2557-kube-api-access-zvwnn\") pod \"keystone-db-create-wpbkz\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") " pod="openstack/keystone-db-create-wpbkz" Mar 19 09:49:52.730521 master-0 kubenswrapper[15202]: I0319 09:49:52.730481 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnr4\" (UniqueName: \"kubernetes.io/projected/4b34559e-c4bb-497f-9659-c0179f8010be-kube-api-access-vrnr4\") pod \"keystone-16fb-account-create-update-8cp5c\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " pod="openstack/keystone-16fb-account-create-update-8cp5c" Mar 19 09:49:52.730747 master-0 kubenswrapper[15202]: I0319 09:49:52.730652 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b34559e-c4bb-497f-9659-c0179f8010be-operator-scripts\") pod \"keystone-16fb-account-create-update-8cp5c\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " pod="openstack/keystone-16fb-account-create-update-8cp5c" Mar 19 09:49:52.731124 master-0 kubenswrapper[15202]: I0319 09:49:52.730835 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68cac917-044f-4346-81f4-336f070d2557-operator-scripts\") pod \"keystone-db-create-wpbkz\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") " pod="openstack/keystone-db-create-wpbkz" Mar 19 
09:49:52.737514 master-0 kubenswrapper[15202]: I0319 09:49:52.737193 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68cac917-044f-4346-81f4-336f070d2557-operator-scripts\") pod \"keystone-db-create-wpbkz\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") " pod="openstack/keystone-db-create-wpbkz"
Mar 19 09:49:52.774762 master-0 kubenswrapper[15202]: I0319 09:49:52.774712 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwnn\" (UniqueName: \"kubernetes.io/projected/68cac917-044f-4346-81f4-336f070d2557-kube-api-access-zvwnn\") pod \"keystone-db-create-wpbkz\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") " pod="openstack/keystone-db-create-wpbkz"
Mar 19 09:49:52.850894 master-0 kubenswrapper[15202]: I0319 09:49:52.850129 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b34559e-c4bb-497f-9659-c0179f8010be-operator-scripts\") pod \"keystone-16fb-account-create-update-8cp5c\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " pod="openstack/keystone-16fb-account-create-update-8cp5c"
Mar 19 09:49:52.850894 master-0 kubenswrapper[15202]: I0319 09:49:52.850406 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnr4\" (UniqueName: \"kubernetes.io/projected/4b34559e-c4bb-497f-9659-c0179f8010be-kube-api-access-vrnr4\") pod \"keystone-16fb-account-create-update-8cp5c\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " pod="openstack/keystone-16fb-account-create-update-8cp5c"
Mar 19 09:49:52.853613 master-0 kubenswrapper[15202]: I0319 09:49:52.852646 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b34559e-c4bb-497f-9659-c0179f8010be-operator-scripts\") pod \"keystone-16fb-account-create-update-8cp5c\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " pod="openstack/keystone-16fb-account-create-update-8cp5c"
Mar 19 09:49:52.866480 master-0 kubenswrapper[15202]: I0319 09:49:52.866409 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7" path="/var/lib/kubelet/pods/e943b4f9-bf1c-4b0d-98a7-0ce26e2256e7/volumes"
Mar 19 09:49:52.868739 master-0 kubenswrapper[15202]: I0319 09:49:52.868705 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnr4\" (UniqueName: \"kubernetes.io/projected/4b34559e-c4bb-497f-9659-c0179f8010be-kube-api-access-vrnr4\") pod \"keystone-16fb-account-create-update-8cp5c\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " pod="openstack/keystone-16fb-account-create-update-8cp5c"
Mar 19 09:49:52.877630 master-0 kubenswrapper[15202]: I0319 09:49:52.877585 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5bqq7"]
Mar 19 09:49:52.879201 master-0 kubenswrapper[15202]: I0319 09:49:52.879173 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:52.885176 master-0 kubenswrapper[15202]: I0319 09:49:52.885113 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5bqq7"]
Mar 19 09:49:52.956096 master-0 kubenswrapper[15202]: I0319 09:49:52.952767 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p7nr\" (UniqueName: \"kubernetes.io/projected/b6308b84-19d0-444a-9ef6-6ab67792a0c5-kube-api-access-8p7nr\") pod \"placement-db-create-5bqq7\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") " pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:52.965531 master-0 kubenswrapper[15202]: I0319 09:49:52.962128 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6308b84-19d0-444a-9ef6-6ab67792a0c5-operator-scripts\") pod \"placement-db-create-5bqq7\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") " pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:52.965531 master-0 kubenswrapper[15202]: I0319 09:49:52.955673 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f95e-account-create-update-gph65"]
Mar 19 09:49:52.965531 master-0 kubenswrapper[15202]: I0319 09:49:52.965160 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:52.967876 master-0 kubenswrapper[15202]: I0319 09:49:52.967155 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f95e-account-create-update-gph65"]
Mar 19 09:49:52.972613 master-0 kubenswrapper[15202]: I0319 09:49:52.971186 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 19 09:49:52.988119 master-0 kubenswrapper[15202]: I0319 09:49:52.988059 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wpbkz"
Mar 19 09:49:53.056289 master-0 kubenswrapper[15202]: I0319 09:49:53.056210 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16fb-account-create-update-8cp5c"
Mar 19 09:49:53.064304 master-0 kubenswrapper[15202]: I0319 09:49:53.064234 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4prhd\" (UniqueName: \"kubernetes.io/projected/b19f30b4-71bc-486c-9fa7-306c8dac619b-kube-api-access-4prhd\") pod \"placement-f95e-account-create-update-gph65\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") " pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:53.064535 master-0 kubenswrapper[15202]: I0319 09:49:53.064453 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6308b84-19d0-444a-9ef6-6ab67792a0c5-operator-scripts\") pod \"placement-db-create-5bqq7\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") " pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:53.064682 master-0 kubenswrapper[15202]: I0319 09:49:53.064655 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p7nr\" (UniqueName: \"kubernetes.io/projected/b6308b84-19d0-444a-9ef6-6ab67792a0c5-kube-api-access-8p7nr\") pod \"placement-db-create-5bqq7\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") " pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:53.064754 master-0 kubenswrapper[15202]: I0319 09:49:53.064713 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b19f30b4-71bc-486c-9fa7-306c8dac619b-operator-scripts\") pod \"placement-f95e-account-create-update-gph65\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") " pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:53.065300 master-0 kubenswrapper[15202]: I0319 09:49:53.065269 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6308b84-19d0-444a-9ef6-6ab67792a0c5-operator-scripts\") pod \"placement-db-create-5bqq7\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") " pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:53.091504 master-0 kubenswrapper[15202]: I0319 09:49:53.090989 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p7nr\" (UniqueName: \"kubernetes.io/projected/b6308b84-19d0-444a-9ef6-6ab67792a0c5-kube-api-access-8p7nr\") pod \"placement-db-create-5bqq7\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") " pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:53.173514 master-0 kubenswrapper[15202]: I0319 09:49:53.173307 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4prhd\" (UniqueName: \"kubernetes.io/projected/b19f30b4-71bc-486c-9fa7-306c8dac619b-kube-api-access-4prhd\") pod \"placement-f95e-account-create-update-gph65\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") " pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:53.173514 master-0 kubenswrapper[15202]: I0319 09:49:53.173448 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b19f30b4-71bc-486c-9fa7-306c8dac619b-operator-scripts\") pod \"placement-f95e-account-create-update-gph65\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") " pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:53.176693 master-0 kubenswrapper[15202]: I0319 09:49:53.176660 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b19f30b4-71bc-486c-9fa7-306c8dac619b-operator-scripts\") pod \"placement-f95e-account-create-update-gph65\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") " pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:53.257510 master-0 kubenswrapper[15202]: I0319 09:49:53.257426 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:53.290317 master-0 kubenswrapper[15202]: I0319 09:49:53.290251 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4prhd\" (UniqueName: \"kubernetes.io/projected/b19f30b4-71bc-486c-9fa7-306c8dac619b-kube-api-access-4prhd\") pod \"placement-f95e-account-create-update-gph65\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") " pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:53.315846 master-0 kubenswrapper[15202]: I0319 09:49:53.315784 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:54.940608 master-0 kubenswrapper[15202]: I0319 09:49:54.932313 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0"
Mar 19 09:49:54.940608 master-0 kubenswrapper[15202]: I0319 09:49:54.939569 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d9c99748-0ca1-4e25-947a-801c2e8748f5-etc-swift\") pod \"swift-storage-0\" (UID: \"d9c99748-0ca1-4e25-947a-801c2e8748f5\") " pod="openstack/swift-storage-0"
Mar 19 09:49:55.038376 master-0 kubenswrapper[15202]: I0319 09:49:55.038314 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 19 09:49:55.552999 master-0 kubenswrapper[15202]: I0319 09:49:55.552788 15202 generic.go:334] "Generic (PLEG): container finished" podID="e8372380-9188-4c0f-9e75-05739d26a27c" containerID="c8843573ed1aa244dac508e291e031e81054ef65f66461b2b12bc4e1f57fed52" exitCode=0
Mar 19 09:49:55.558225 master-0 kubenswrapper[15202]: I0319 09:49:55.558157 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l8hw9" event={"ID":"e8372380-9188-4c0f-9e75-05739d26a27c","Type":"ContainerDied","Data":"c8843573ed1aa244dac508e291e031e81054ef65f66461b2b12bc4e1f57fed52"}
Mar 19 09:49:55.623667 master-0 kubenswrapper[15202]: W0319 09:49:55.623621 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b34559e_c4bb_497f_9659_c0179f8010be.slice/crio-3006fcd8d2ed63911461dffd4fcb1cd4e3ee959569f43c0fd201bd34da1f7b47 WatchSource:0}: Error finding container 3006fcd8d2ed63911461dffd4fcb1cd4e3ee959569f43c0fd201bd34da1f7b47: Status 404 returned error can't find the container with id 3006fcd8d2ed63911461dffd4fcb1cd4e3ee959569f43c0fd201bd34da1f7b47
Mar 19 09:49:55.625335 master-0 kubenswrapper[15202]: I0319 09:49:55.625293 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wpbkz"]
Mar 19 09:49:55.631085 master-0 kubenswrapper[15202]: W0319 09:49:55.630401 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68cac917_044f_4346_81f4_336f070d2557.slice/crio-211b06b231fd71afd6151fe207be386ae219d9beb2677c9581f9b23eb5c89dde WatchSource:0}: Error finding container 211b06b231fd71afd6151fe207be386ae219d9beb2677c9581f9b23eb5c89dde: Status 404 returned error can't find the container with id 211b06b231fd71afd6151fe207be386ae219d9beb2677c9581f9b23eb5c89dde
Mar 19 09:49:55.644797 master-0 kubenswrapper[15202]: I0319 09:49:55.644731 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-16fb-account-create-update-8cp5c"]
Mar 19 09:49:55.726429 master-0 kubenswrapper[15202]: I0319 09:49:55.725493 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zxw2c"]
Mar 19 09:49:55.750957 master-0 kubenswrapper[15202]: W0319 09:49:55.750905 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9bce886_0af6_432e_ab68_b4af30a4defd.slice/crio-d94ffaca0ede5ff4e2b1d05f0bc3779f014ad819344d687838eb6f2a70e75027 WatchSource:0}: Error finding container d94ffaca0ede5ff4e2b1d05f0bc3779f014ad819344d687838eb6f2a70e75027: Status 404 returned error can't find the container with id d94ffaca0ede5ff4e2b1d05f0bc3779f014ad819344d687838eb6f2a70e75027
Mar 19 09:49:55.778402 master-0 kubenswrapper[15202]: I0319 09:49:55.778214 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f95e-account-create-update-gph65"]
Mar 19 09:49:55.789174 master-0 kubenswrapper[15202]: I0319 09:49:55.789085 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5bqq7"]
Mar 19 09:49:55.808661 master-0 kubenswrapper[15202]: I0319 09:49:55.808382 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k88tf"]
Mar 19 09:49:55.840725 master-0 kubenswrapper[15202]: I0319 09:49:55.833273 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k88tf"]
Mar 19 09:49:56.252793 master-0 kubenswrapper[15202]: I0319 09:49:56.247878 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 19 09:49:56.566851 master-0 kubenswrapper[15202]: I0319 09:49:56.566744 15202 generic.go:334] "Generic (PLEG): container finished" podID="b19f30b4-71bc-486c-9fa7-306c8dac619b" containerID="4576df9849ff0c12cbc12baac98225d3548ba664159881c63610cbe608e13714" exitCode=0
Mar 19 09:49:56.567171 master-0 kubenswrapper[15202]: I0319 09:49:56.566878 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f95e-account-create-update-gph65" event={"ID":"b19f30b4-71bc-486c-9fa7-306c8dac619b","Type":"ContainerDied","Data":"4576df9849ff0c12cbc12baac98225d3548ba664159881c63610cbe608e13714"}
Mar 19 09:49:56.567171 master-0 kubenswrapper[15202]: I0319 09:49:56.566941 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f95e-account-create-update-gph65" event={"ID":"b19f30b4-71bc-486c-9fa7-306c8dac619b","Type":"ContainerStarted","Data":"72c00884b0069a66aa40882cda4b976331514d0538731b84179357a973206b86"}
Mar 19 09:49:56.570566 master-0 kubenswrapper[15202]: I0319 09:49:56.570173 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"3376654d7e8cae820e0233555862813adb3b3e21ffc4079a443d9ee4b3dd3eab"}
Mar 19 09:49:56.577015 master-0 kubenswrapper[15202]: I0319 09:49:56.576954 15202 generic.go:334] "Generic (PLEG): container finished" podID="68cac917-044f-4346-81f4-336f070d2557" containerID="30e76772395e2cd00d44891ebcc7926f70b32eb387104ca51f9cd09acf540dcc" exitCode=0
Mar 19 09:49:56.577015 master-0 kubenswrapper[15202]: I0319 09:49:56.577019 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wpbkz" event={"ID":"68cac917-044f-4346-81f4-336f070d2557","Type":"ContainerDied","Data":"30e76772395e2cd00d44891ebcc7926f70b32eb387104ca51f9cd09acf540dcc"}
Mar 19 09:49:56.577521 master-0 kubenswrapper[15202]: I0319 09:49:56.577041 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wpbkz" event={"ID":"68cac917-044f-4346-81f4-336f070d2557","Type":"ContainerStarted","Data":"211b06b231fd71afd6151fe207be386ae219d9beb2677c9581f9b23eb5c89dde"}
Mar 19 09:49:56.579613 master-0 kubenswrapper[15202]: I0319 09:49:56.579572 15202 generic.go:334] "Generic (PLEG): container finished" podID="9bab9d65-06f1-4b08-aa8c-5f12e7d06183" containerID="ebec7d98a3d83cb31d22f9e455d5d4152c2b8a16ffdd6185b239bdfe6662a137" exitCode=0
Mar 19 09:49:56.579613 master-0 kubenswrapper[15202]: I0319 09:49:56.579614 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9bab9d65-06f1-4b08-aa8c-5f12e7d06183","Type":"ContainerDied","Data":"ebec7d98a3d83cb31d22f9e455d5d4152c2b8a16ffdd6185b239bdfe6662a137"}
Mar 19 09:49:56.592344 master-0 kubenswrapper[15202]: I0319 09:49:56.587582 15202 generic.go:334] "Generic (PLEG): container finished" podID="4b34559e-c4bb-497f-9659-c0179f8010be" containerID="bf9280325357452e6fda657e1d35b0337731a76ba60bf9ce428f7c001f2e72d5" exitCode=0
Mar 19 09:49:56.592344 master-0 kubenswrapper[15202]: I0319 09:49:56.587679 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16fb-account-create-update-8cp5c" event={"ID":"4b34559e-c4bb-497f-9659-c0179f8010be","Type":"ContainerDied","Data":"bf9280325357452e6fda657e1d35b0337731a76ba60bf9ce428f7c001f2e72d5"}
Mar 19 09:49:56.592344 master-0 kubenswrapper[15202]: I0319 09:49:56.587713 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16fb-account-create-update-8cp5c" event={"ID":"4b34559e-c4bb-497f-9659-c0179f8010be","Type":"ContainerStarted","Data":"3006fcd8d2ed63911461dffd4fcb1cd4e3ee959569f43c0fd201bd34da1f7b47"}
Mar 19 09:49:56.592859 master-0 kubenswrapper[15202]: I0319 09:49:56.592683 15202 generic.go:334] "Generic (PLEG): container finished" podID="b6308b84-19d0-444a-9ef6-6ab67792a0c5" containerID="23dc01441218bd4c719812d52d08d2e2a21ed5b494082120e597b96b806fb5ab" exitCode=0
Mar 19 09:49:56.592859 master-0 kubenswrapper[15202]: I0319 09:49:56.592755 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bqq7" event={"ID":"b6308b84-19d0-444a-9ef6-6ab67792a0c5","Type":"ContainerDied","Data":"23dc01441218bd4c719812d52d08d2e2a21ed5b494082120e597b96b806fb5ab"}
Mar 19 09:49:56.592859 master-0 kubenswrapper[15202]: I0319 09:49:56.592783 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bqq7" event={"ID":"b6308b84-19d0-444a-9ef6-6ab67792a0c5","Type":"ContainerStarted","Data":"aa89dca39cf164bb7cba3be9a15ee73fa7342c4963dc2c9814a748d9c42a2bb1"}
Mar 19 09:49:56.598527 master-0 kubenswrapper[15202]: I0319 09:49:56.596322 15202 generic.go:334] "Generic (PLEG): container finished" podID="67fbe9e8-1121-4091-954c-c6a620d98528" containerID="e1a908256ed8d079197cd9a0453f42cc838e68649df1a123e8b31c033bc7d0a4" exitCode=0
Mar 19 09:49:56.598527 master-0 kubenswrapper[15202]: I0319 09:49:56.596400 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67fbe9e8-1121-4091-954c-c6a620d98528","Type":"ContainerDied","Data":"e1a908256ed8d079197cd9a0453f42cc838e68649df1a123e8b31c033bc7d0a4"}
Mar 19 09:49:56.603905 master-0 kubenswrapper[15202]: I0319 09:49:56.603778 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zxw2c" event={"ID":"c9bce886-0af6-432e-ab68-b4af30a4defd","Type":"ContainerStarted","Data":"d94ffaca0ede5ff4e2b1d05f0bc3779f014ad819344d687838eb6f2a70e75027"}
Mar 19 09:49:56.828327 master-0 kubenswrapper[15202]: I0319 09:49:56.828273 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e93cf3-44d7-4fd9-bda9-f761362a00b8" path="/var/lib/kubelet/pods/c5e93cf3-44d7-4fd9-bda9-f761362a00b8/volumes"
Mar 19 09:49:57.173263 master-0 kubenswrapper[15202]: I0319 09:49:57.173175 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-l8hw9"
Mar 19 09:49:57.201952 master-0 kubenswrapper[15202]: I0319 09:49:57.201755 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-swiftconf\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.201952 master-0 kubenswrapper[15202]: I0319 09:49:57.201912 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8372380-9188-4c0f-9e75-05739d26a27c-etc-swift\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.202202 master-0 kubenswrapper[15202]: I0319 09:49:57.201994 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-combined-ca-bundle\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.202202 master-0 kubenswrapper[15202]: I0319 09:49:57.202081 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-ring-data-devices\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.202202 master-0 kubenswrapper[15202]: I0319 09:49:57.202151 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wns86\" (UniqueName: \"kubernetes.io/projected/e8372380-9188-4c0f-9e75-05739d26a27c-kube-api-access-wns86\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.202329 master-0 kubenswrapper[15202]: I0319 09:49:57.202260 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-dispersionconf\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.202329 master-0 kubenswrapper[15202]: I0319 09:49:57.202305 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-scripts\") pod \"e8372380-9188-4c0f-9e75-05739d26a27c\" (UID: \"e8372380-9188-4c0f-9e75-05739d26a27c\") "
Mar 19 09:49:57.203330 master-0 kubenswrapper[15202]: I0319 09:49:57.203119 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8372380-9188-4c0f-9e75-05739d26a27c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:49:57.204642 master-0 kubenswrapper[15202]: I0319 09:49:57.204428 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:49:57.208326 master-0 kubenswrapper[15202]: I0319 09:49:57.208248 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8372380-9188-4c0f-9e75-05739d26a27c-kube-api-access-wns86" (OuterVolumeSpecName: "kube-api-access-wns86") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "kube-api-access-wns86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:49:57.227533 master-0 kubenswrapper[15202]: I0319 09:49:57.226093 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:49:57.232679 master-0 kubenswrapper[15202]: I0319 09:49:57.232627 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-scripts" (OuterVolumeSpecName: "scripts") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:49:57.247509 master-0 kubenswrapper[15202]: I0319 09:49:57.247411 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:49:57.267645 master-0 kubenswrapper[15202]: I0319 09:49:57.267242 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8372380-9188-4c0f-9e75-05739d26a27c" (UID: "e8372380-9188-4c0f-9e75-05739d26a27c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305150 15202 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-swiftconf\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305200 15202 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8372380-9188-4c0f-9e75-05739d26a27c-etc-swift\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305212 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305223 15202 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-ring-data-devices\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305234 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wns86\" (UniqueName: \"kubernetes.io/projected/e8372380-9188-4c0f-9e75-05739d26a27c-kube-api-access-wns86\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305242 15202 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8372380-9188-4c0f-9e75-05739d26a27c-dispersionconf\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.305276 master-0 kubenswrapper[15202]: I0319 09:49:57.305253 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8372380-9188-4c0f-9e75-05739d26a27c-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:57.615847 master-0 kubenswrapper[15202]: I0319 09:49:57.615706 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9bab9d65-06f1-4b08-aa8c-5f12e7d06183","Type":"ContainerStarted","Data":"71131758962f24d930b4dcab4c69bc5f80a98f2162200a07ed99e698b2ae26ec"}
Mar 19 09:49:57.617321 master-0 kubenswrapper[15202]: I0319 09:49:57.617100 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 09:49:57.619138 master-0 kubenswrapper[15202]: I0319 09:49:57.619111 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67fbe9e8-1121-4091-954c-c6a620d98528","Type":"ContainerStarted","Data":"6a518ceb9dc1c26ab96cda760df82440dbc7550377bda6e165729fab875e44e8"}
Mar 19 09:49:57.619681 master-0 kubenswrapper[15202]: I0319 09:49:57.619651 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 19 09:49:57.624931 master-0 kubenswrapper[15202]: I0319 09:49:57.624883 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-l8hw9"
Mar 19 09:49:57.624931 master-0 kubenswrapper[15202]: I0319 09:49:57.624894 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-l8hw9" event={"ID":"e8372380-9188-4c0f-9e75-05739d26a27c","Type":"ContainerDied","Data":"7b57876d094fd8c72acec31ca3282dff43e16d701151d700bb90e1ca94fd2cff"}
Mar 19 09:49:57.625078 master-0 kubenswrapper[15202]: I0319 09:49:57.624938 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b57876d094fd8c72acec31ca3282dff43e16d701151d700bb90e1ca94fd2cff"
Mar 19 09:49:57.794241 master-0 kubenswrapper[15202]: I0319 09:49:57.794151 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.336016205 podStartE2EDuration="1m1.794127759s" podCreationTimestamp="2026-03-19 09:48:56 +0000 UTC" firstStartedPulling="2026-03-19 09:49:08.726798745 +0000 UTC m=+1466.112213561" lastFinishedPulling="2026-03-19 09:49:21.184910299 +0000 UTC m=+1478.570325115" observedRunningTime="2026-03-19 09:49:57.747695455 +0000 UTC m=+1515.133110271" watchObservedRunningTime="2026-03-19 09:49:57.794127759 +0000 UTC m=+1515.179542575"
Mar 19 09:49:57.885117 master-0 kubenswrapper[15202]: I0319 09:49:57.884927 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=60.884904757 podStartE2EDuration="1m0.884904757s" podCreationTimestamp="2026-03-19 09:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:57.82135008 +0000 UTC m=+1515.206764906" watchObservedRunningTime="2026-03-19 09:49:57.884904757 +0000 UTC m=+1515.270319583"
Mar 19 09:49:58.228420 master-0 kubenswrapper[15202]: I0319 09:49:58.226730 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:58.257360 master-0 kubenswrapper[15202]: I0319 09:49:58.257291 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b19f30b4-71bc-486c-9fa7-306c8dac619b-operator-scripts\") pod \"b19f30b4-71bc-486c-9fa7-306c8dac619b\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") "
Mar 19 09:49:58.257512 master-0 kubenswrapper[15202]: I0319 09:49:58.257377 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4prhd\" (UniqueName: \"kubernetes.io/projected/b19f30b4-71bc-486c-9fa7-306c8dac619b-kube-api-access-4prhd\") pod \"b19f30b4-71bc-486c-9fa7-306c8dac619b\" (UID: \"b19f30b4-71bc-486c-9fa7-306c8dac619b\") "
Mar 19 09:49:58.257954 master-0 kubenswrapper[15202]: I0319 09:49:58.257909 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b19f30b4-71bc-486c-9fa7-306c8dac619b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b19f30b4-71bc-486c-9fa7-306c8dac619b" (UID: "b19f30b4-71bc-486c-9fa7-306c8dac619b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:49:58.258530 master-0 kubenswrapper[15202]: I0319 09:49:58.258507 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b19f30b4-71bc-486c-9fa7-306c8dac619b-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:58.275773 master-0 kubenswrapper[15202]: I0319 09:49:58.275708 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19f30b4-71bc-486c-9fa7-306c8dac619b-kube-api-access-4prhd" (OuterVolumeSpecName: "kube-api-access-4prhd") pod "b19f30b4-71bc-486c-9fa7-306c8dac619b" (UID: "b19f30b4-71bc-486c-9fa7-306c8dac619b"). InnerVolumeSpecName "kube-api-access-4prhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:49:58.374002 master-0 kubenswrapper[15202]: I0319 09:49:58.373916 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4prhd\" (UniqueName: \"kubernetes.io/projected/b19f30b4-71bc-486c-9fa7-306c8dac619b-kube-api-access-4prhd\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:58.657193 master-0 kubenswrapper[15202]: I0319 09:49:58.657131 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f95e-account-create-update-gph65" event={"ID":"b19f30b4-71bc-486c-9fa7-306c8dac619b","Type":"ContainerDied","Data":"72c00884b0069a66aa40882cda4b976331514d0538731b84179357a973206b86"}
Mar 19 09:49:58.657193 master-0 kubenswrapper[15202]: I0319 09:49:58.657199 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72c00884b0069a66aa40882cda4b976331514d0538731b84179357a973206b86"
Mar 19 09:49:58.657478 master-0 kubenswrapper[15202]: I0319 09:49:58.657295 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f95e-account-create-update-gph65"
Mar 19 09:49:58.664445 master-0 kubenswrapper[15202]: I0319 09:49:58.664385 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:58.671484 master-0 kubenswrapper[15202]: I0319 09:49:58.668379 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"ffabb5163355114a0106d068feeb68692b2491f55285028064d4dab0b7135b0e"}
Mar 19 09:49:58.671484 master-0 kubenswrapper[15202]: I0319 09:49:58.668419 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"1f0591933b4313de5281a7c239d9af03097fbbd96c46fed37385be74901e7583"}
Mar 19 09:49:58.673560 master-0 kubenswrapper[15202]: I0319 09:49:58.671831 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5bqq7"
Mar 19 09:49:58.673560 master-0 kubenswrapper[15202]: I0319 09:49:58.671985 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bqq7" event={"ID":"b6308b84-19d0-444a-9ef6-6ab67792a0c5","Type":"ContainerDied","Data":"aa89dca39cf164bb7cba3be9a15ee73fa7342c4963dc2c9814a748d9c42a2bb1"}
Mar 19 09:49:58.673560 master-0 kubenswrapper[15202]: I0319 09:49:58.672009 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa89dca39cf164bb7cba3be9a15ee73fa7342c4963dc2c9814a748d9c42a2bb1"
Mar 19 09:49:58.688197 master-0 kubenswrapper[15202]: I0319 09:49:58.688146 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p7nr\" (UniqueName: \"kubernetes.io/projected/b6308b84-19d0-444a-9ef6-6ab67792a0c5-kube-api-access-8p7nr\") pod \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") "
Mar 19 09:49:58.688345 master-0 kubenswrapper[15202]: I0319 09:49:58.688268 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6308b84-19d0-444a-9ef6-6ab67792a0c5-operator-scripts\") pod \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\" (UID: \"b6308b84-19d0-444a-9ef6-6ab67792a0c5\") "
Mar 19 09:49:58.697967 master-0 kubenswrapper[15202]: I0319 09:49:58.697898 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6308b84-19d0-444a-9ef6-6ab67792a0c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6308b84-19d0-444a-9ef6-6ab67792a0c5" (UID: "b6308b84-19d0-444a-9ef6-6ab67792a0c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:49:58.702640 master-0 kubenswrapper[15202]: I0319 09:49:58.702297 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16fb-account-create-update-8cp5c"
Mar 19 09:49:58.716406 master-0 kubenswrapper[15202]: I0319 09:49:58.715148 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6308b84-19d0-444a-9ef6-6ab67792a0c5-kube-api-access-8p7nr" (OuterVolumeSpecName: "kube-api-access-8p7nr") pod "b6308b84-19d0-444a-9ef6-6ab67792a0c5" (UID: "b6308b84-19d0-444a-9ef6-6ab67792a0c5"). InnerVolumeSpecName "kube-api-access-8p7nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:49:58.722279 master-0 kubenswrapper[15202]: I0319 09:49:58.722033 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wpbkz"
Mar 19 09:49:58.798604 master-0 kubenswrapper[15202]: I0319 09:49:58.790413 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p7nr\" (UniqueName: \"kubernetes.io/projected/b6308b84-19d0-444a-9ef6-6ab67792a0c5-kube-api-access-8p7nr\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:58.798604 master-0 kubenswrapper[15202]: I0319 09:49:58.790451 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6308b84-19d0-444a-9ef6-6ab67792a0c5-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:49:58.894692 master-0 kubenswrapper[15202]: I0319 09:49:58.892458 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvwnn\" (UniqueName: \"kubernetes.io/projected/68cac917-044f-4346-81f4-336f070d2557-kube-api-access-zvwnn\") pod \"68cac917-044f-4346-81f4-336f070d2557\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") "
Mar 19 09:49:58.894692 master-0 kubenswrapper[15202]: I0319 09:49:58.892837 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68cac917-044f-4346-81f4-336f070d2557-operator-scripts\") pod \"68cac917-044f-4346-81f4-336f070d2557\" (UID: \"68cac917-044f-4346-81f4-336f070d2557\") "
Mar 19 09:49:58.894692 master-0 kubenswrapper[15202]: I0319 09:49:58.892925 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b34559e-c4bb-497f-9659-c0179f8010be-operator-scripts\") pod \"4b34559e-c4bb-497f-9659-c0179f8010be\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") "
Mar 19 09:49:58.894692 master-0 kubenswrapper[15202]: I0319 09:49:58.893203 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrnr4\" (UniqueName: 
\"kubernetes.io/projected/4b34559e-c4bb-497f-9659-c0179f8010be-kube-api-access-vrnr4\") pod \"4b34559e-c4bb-497f-9659-c0179f8010be\" (UID: \"4b34559e-c4bb-497f-9659-c0179f8010be\") " Mar 19 09:49:58.896274 master-0 kubenswrapper[15202]: I0319 09:49:58.895890 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68cac917-044f-4346-81f4-336f070d2557-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68cac917-044f-4346-81f4-336f070d2557" (UID: "68cac917-044f-4346-81f4-336f070d2557"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:58.897332 master-0 kubenswrapper[15202]: I0319 09:49:58.897270 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b34559e-c4bb-497f-9659-c0179f8010be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b34559e-c4bb-497f-9659-c0179f8010be" (UID: "4b34559e-c4bb-497f-9659-c0179f8010be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:58.898813 master-0 kubenswrapper[15202]: I0319 09:49:58.898773 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b34559e-c4bb-497f-9659-c0179f8010be-kube-api-access-vrnr4" (OuterVolumeSpecName: "kube-api-access-vrnr4") pod "4b34559e-c4bb-497f-9659-c0179f8010be" (UID: "4b34559e-c4bb-497f-9659-c0179f8010be"). InnerVolumeSpecName "kube-api-access-vrnr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:58.902462 master-0 kubenswrapper[15202]: I0319 09:49:58.902406 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cac917-044f-4346-81f4-336f070d2557-kube-api-access-zvwnn" (OuterVolumeSpecName: "kube-api-access-zvwnn") pod "68cac917-044f-4346-81f4-336f070d2557" (UID: "68cac917-044f-4346-81f4-336f070d2557"). 
InnerVolumeSpecName "kube-api-access-zvwnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:58.998880 master-0 kubenswrapper[15202]: I0319 09:49:58.998805 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvwnn\" (UniqueName: \"kubernetes.io/projected/68cac917-044f-4346-81f4-336f070d2557-kube-api-access-zvwnn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:58.998880 master-0 kubenswrapper[15202]: I0319 09:49:58.998881 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68cac917-044f-4346-81f4-336f070d2557-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:58.999081 master-0 kubenswrapper[15202]: I0319 09:49:58.998899 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b34559e-c4bb-497f-9659-c0179f8010be-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:58.999081 master-0 kubenswrapper[15202]: I0319 09:49:58.998915 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrnr4\" (UniqueName: \"kubernetes.io/projected/4b34559e-c4bb-497f-9659-c0179f8010be-kube-api-access-vrnr4\") on node \"master-0\" DevicePath \"\"" Mar 19 09:49:59.687652 master-0 kubenswrapper[15202]: I0319 09:49:59.687599 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wpbkz" Mar 19 09:49:59.688307 master-0 kubenswrapper[15202]: I0319 09:49:59.687637 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wpbkz" event={"ID":"68cac917-044f-4346-81f4-336f070d2557","Type":"ContainerDied","Data":"211b06b231fd71afd6151fe207be386ae219d9beb2677c9581f9b23eb5c89dde"} Mar 19 09:49:59.688307 master-0 kubenswrapper[15202]: I0319 09:49:59.687729 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="211b06b231fd71afd6151fe207be386ae219d9beb2677c9581f9b23eb5c89dde" Mar 19 09:49:59.691388 master-0 kubenswrapper[15202]: I0319 09:49:59.691352 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16fb-account-create-update-8cp5c" event={"ID":"4b34559e-c4bb-497f-9659-c0179f8010be","Type":"ContainerDied","Data":"3006fcd8d2ed63911461dffd4fcb1cd4e3ee959569f43c0fd201bd34da1f7b47"} Mar 19 09:49:59.691476 master-0 kubenswrapper[15202]: I0319 09:49:59.691392 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3006fcd8d2ed63911461dffd4fcb1cd4e3ee959569f43c0fd201bd34da1f7b47" Mar 19 09:49:59.691712 master-0 kubenswrapper[15202]: I0319 09:49:59.691687 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-16fb-account-create-update-8cp5c" Mar 19 09:49:59.706029 master-0 kubenswrapper[15202]: I0319 09:49:59.705943 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"d0523044265253b6af84dbe580ebf0b292c9a5f95b85e52258f612d15cff2284"} Mar 19 09:49:59.706029 master-0 kubenswrapper[15202]: I0319 09:49:59.706037 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"c6d616ad10c9c65c8425766542bb452cebce4d376c896c9bb597ecccbc3ad814"} Mar 19 09:50:00.737116 master-0 kubenswrapper[15202]: I0319 09:50:00.737043 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"fb00957523ccfbf8fec0662bb991a2b54d1041e9e91f589b083ba2178d7d9f78"} Mar 19 09:50:00.737116 master-0 kubenswrapper[15202]: I0319 09:50:00.737115 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"367b9f6208e79bc2b0aec110590872a23dcf9ce6747c861d4557114ae354ba0e"} Mar 19 09:50:00.770388 master-0 kubenswrapper[15202]: I0319 09:50:00.770239 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dh5fs"] Mar 19 09:50:00.770808 master-0 kubenswrapper[15202]: E0319 09:50:00.770769 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cac917-044f-4346-81f4-336f070d2557" containerName="mariadb-database-create" Mar 19 09:50:00.770808 master-0 kubenswrapper[15202]: I0319 09:50:00.770791 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cac917-044f-4346-81f4-336f070d2557" containerName="mariadb-database-create" Mar 19 09:50:00.770956 master-0 
kubenswrapper[15202]: E0319 09:50:00.770841 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8372380-9188-4c0f-9e75-05739d26a27c" containerName="swift-ring-rebalance" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: I0319 09:50:00.770849 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8372380-9188-4c0f-9e75-05739d26a27c" containerName="swift-ring-rebalance" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: E0319 09:50:00.770870 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b34559e-c4bb-497f-9659-c0179f8010be" containerName="mariadb-account-create-update" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: I0319 09:50:00.770877 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b34559e-c4bb-497f-9659-c0179f8010be" containerName="mariadb-account-create-update" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: E0319 09:50:00.770892 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19f30b4-71bc-486c-9fa7-306c8dac619b" containerName="mariadb-account-create-update" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: I0319 09:50:00.770899 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19f30b4-71bc-486c-9fa7-306c8dac619b" containerName="mariadb-account-create-update" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: E0319 09:50:00.770908 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6308b84-19d0-444a-9ef6-6ab67792a0c5" containerName="mariadb-database-create" Mar 19 09:50:00.770956 master-0 kubenswrapper[15202]: I0319 09:50:00.770915 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6308b84-19d0-444a-9ef6-6ab67792a0c5" containerName="mariadb-database-create" Mar 19 09:50:00.771281 master-0 kubenswrapper[15202]: I0319 09:50:00.771118 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cac917-044f-4346-81f4-336f070d2557" containerName="mariadb-database-create" Mar 19 09:50:00.771281 
master-0 kubenswrapper[15202]: I0319 09:50:00.771149 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6308b84-19d0-444a-9ef6-6ab67792a0c5" containerName="mariadb-database-create" Mar 19 09:50:00.771281 master-0 kubenswrapper[15202]: I0319 09:50:00.771176 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8372380-9188-4c0f-9e75-05739d26a27c" containerName="swift-ring-rebalance" Mar 19 09:50:00.771281 master-0 kubenswrapper[15202]: I0319 09:50:00.771194 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b34559e-c4bb-497f-9659-c0179f8010be" containerName="mariadb-account-create-update" Mar 19 09:50:00.771281 master-0 kubenswrapper[15202]: I0319 09:50:00.771210 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19f30b4-71bc-486c-9fa7-306c8dac619b" containerName="mariadb-account-create-update" Mar 19 09:50:00.773018 master-0 kubenswrapper[15202]: I0319 09:50:00.772992 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:00.780838 master-0 kubenswrapper[15202]: I0319 09:50:00.780784 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 09:50:00.807084 master-0 kubenswrapper[15202]: I0319 09:50:00.797374 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dh5fs"] Mar 19 09:50:00.864670 master-0 kubenswrapper[15202]: I0319 09:50:00.864607 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wb2t\" (UniqueName: \"kubernetes.io/projected/7b308d6d-e494-4e76-8b8f-5c661340666b-kube-api-access-8wb2t\") pod \"root-account-create-update-dh5fs\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:00.864836 master-0 kubenswrapper[15202]: I0319 09:50:00.864736 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308d6d-e494-4e76-8b8f-5c661340666b-operator-scripts\") pod \"root-account-create-update-dh5fs\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:00.966794 master-0 kubenswrapper[15202]: I0319 09:50:00.966725 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wb2t\" (UniqueName: \"kubernetes.io/projected/7b308d6d-e494-4e76-8b8f-5c661340666b-kube-api-access-8wb2t\") pod \"root-account-create-update-dh5fs\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:00.966936 master-0 kubenswrapper[15202]: I0319 09:50:00.966870 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7b308d6d-e494-4e76-8b8f-5c661340666b-operator-scripts\") pod \"root-account-create-update-dh5fs\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:00.970502 master-0 kubenswrapper[15202]: I0319 09:50:00.970453 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308d6d-e494-4e76-8b8f-5c661340666b-operator-scripts\") pod \"root-account-create-update-dh5fs\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:00.988982 master-0 kubenswrapper[15202]: I0319 09:50:00.988911 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wb2t\" (UniqueName: \"kubernetes.io/projected/7b308d6d-e494-4e76-8b8f-5c661340666b-kube-api-access-8wb2t\") pod \"root-account-create-update-dh5fs\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:01.104299 master-0 kubenswrapper[15202]: I0319 09:50:01.104138 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:01.606932 master-0 kubenswrapper[15202]: I0319 09:50:01.606684 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dh5fs"] Mar 19 09:50:01.752007 master-0 kubenswrapper[15202]: I0319 09:50:01.751937 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dh5fs" event={"ID":"7b308d6d-e494-4e76-8b8f-5c661340666b","Type":"ContainerStarted","Data":"1504e0a128e89fe7c2afb83f2858869f2e9b43180c04b10077fcd6a65fac112d"} Mar 19 09:50:01.763266 master-0 kubenswrapper[15202]: I0319 09:50:01.763199 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"e4ebdad13c9b0e758c8cd287de7cf75d7a224e5bc3913609508c97c2c6e7ec5e"} Mar 19 09:50:01.763266 master-0 kubenswrapper[15202]: I0319 09:50:01.763264 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"a76eac9f34f21b772329c96aacde674e81ac9dacd453e5aadcd45673fdfa518d"} Mar 19 09:50:02.112363 master-0 kubenswrapper[15202]: I0319 09:50:02.112298 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 09:50:02.781728 master-0 kubenswrapper[15202]: I0319 09:50:02.781673 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"752a75415f1d9aa36a63f8be460af34744ec013730b3329dad350a235e1f56ba"} Mar 19 09:50:02.783537 master-0 kubenswrapper[15202]: I0319 09:50:02.783282 15202 generic.go:334] "Generic (PLEG): container finished" podID="7b308d6d-e494-4e76-8b8f-5c661340666b" containerID="c62c66022b5964dfe35ac5ca7b1a14f92fc70df12e6a5bfbd36cad4cccb3c538" exitCode=0 Mar 19 
09:50:02.783537 master-0 kubenswrapper[15202]: I0319 09:50:02.783332 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dh5fs" event={"ID":"7b308d6d-e494-4e76-8b8f-5c661340666b","Type":"ContainerDied","Data":"c62c66022b5964dfe35ac5ca7b1a14f92fc70df12e6a5bfbd36cad4cccb3c538"} Mar 19 09:50:02.957817 master-0 kubenswrapper[15202]: I0319 09:50:02.957755 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:50:02.958882 master-0 kubenswrapper[15202]: I0319 09:50:02.958859 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sl66q" Mar 19 09:50:03.172561 master-0 kubenswrapper[15202]: I0319 09:50:03.171773 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m68fw" podUID="2ed2b7a9-27a6-43ac-ba84-ae1a7d670160" containerName="ovn-controller" probeResult="failure" output=< Mar 19 09:50:03.172561 master-0 kubenswrapper[15202]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 09:50:03.172561 master-0 kubenswrapper[15202]: > Mar 19 09:50:03.250577 master-0 kubenswrapper[15202]: I0319 09:50:03.250429 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m68fw-config-f49xm"] Mar 19 09:50:03.251842 master-0 kubenswrapper[15202]: I0319 09:50:03.251811 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.254965 master-0 kubenswrapper[15202]: I0319 09:50:03.254937 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 09:50:03.269595 master-0 kubenswrapper[15202]: I0319 09:50:03.269451 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m68fw-config-f49xm"] Mar 19 09:50:03.338888 master-0 kubenswrapper[15202]: I0319 09:50:03.338816 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mtk8\" (UniqueName: \"kubernetes.io/projected/8707bf06-8fff-40da-b16e-6c1700423045-kube-api-access-2mtk8\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.339142 master-0 kubenswrapper[15202]: I0319 09:50:03.338899 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-log-ovn\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.339142 master-0 kubenswrapper[15202]: I0319 09:50:03.338996 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run-ovn\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.339142 master-0 kubenswrapper[15202]: I0319 09:50:03.339025 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-additional-scripts\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.339142 master-0 kubenswrapper[15202]: I0319 09:50:03.339058 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-scripts\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.339142 master-0 kubenswrapper[15202]: I0319 09:50:03.339100 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.440760 master-0 kubenswrapper[15202]: I0319 09:50:03.440716 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-log-ovn\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.444073 master-0 kubenswrapper[15202]: I0319 09:50:03.443715 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-log-ovn\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.444534 master-0 kubenswrapper[15202]: I0319 09:50:03.444516 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run-ovn\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.445284 master-0 kubenswrapper[15202]: I0319 09:50:03.444858 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run-ovn\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.445378 master-0 kubenswrapper[15202]: I0319 09:50:03.445245 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-additional-scripts\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.446622 master-0 kubenswrapper[15202]: I0319 09:50:03.446565 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-scripts\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.449078 master-0 kubenswrapper[15202]: I0319 09:50:03.446745 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-additional-scripts\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.449817 master-0 kubenswrapper[15202]: I0319 
09:50:03.449800 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.450358 master-0 kubenswrapper[15202]: I0319 09:50:03.450322 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mtk8\" (UniqueName: \"kubernetes.io/projected/8707bf06-8fff-40da-b16e-6c1700423045-kube-api-access-2mtk8\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.452811 master-0 kubenswrapper[15202]: I0319 09:50:03.452795 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.458499 master-0 kubenswrapper[15202]: I0319 09:50:03.458429 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-scripts\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.475503 master-0 kubenswrapper[15202]: I0319 09:50:03.475435 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mtk8\" (UniqueName: \"kubernetes.io/projected/8707bf06-8fff-40da-b16e-6c1700423045-kube-api-access-2mtk8\") pod \"ovn-controller-m68fw-config-f49xm\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.600777 master-0 
kubenswrapper[15202]: I0319 09:50:03.600700 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:03.813230 master-0 kubenswrapper[15202]: I0319 09:50:03.811035 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"a7c6075a64016ffb2973b5d96a12f340c92f803a5ec5832c14dbd125856e1ec1"} Mar 19 09:50:03.813230 master-0 kubenswrapper[15202]: I0319 09:50:03.811181 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"7315e3f1ad9245f50fac62b1999de7dbd29aff8c465067d82a4f42d3b7434287"} Mar 19 09:50:08.153301 master-0 kubenswrapper[15202]: I0319 09:50:08.153172 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m68fw" podUID="2ed2b7a9-27a6-43ac-ba84-ae1a7d670160" containerName="ovn-controller" probeResult="failure" output=< Mar 19 09:50:08.153301 master-0 kubenswrapper[15202]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 09:50:08.153301 master-0 kubenswrapper[15202]: > Mar 19 09:50:10.777163 master-0 kubenswrapper[15202]: I0319 09:50:10.777142 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:10.907604 master-0 kubenswrapper[15202]: I0319 09:50:10.907439 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dh5fs" Mar 19 09:50:10.923907 master-0 kubenswrapper[15202]: I0319 09:50:10.923816 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dh5fs" event={"ID":"7b308d6d-e494-4e76-8b8f-5c661340666b","Type":"ContainerDied","Data":"1504e0a128e89fe7c2afb83f2858869f2e9b43180c04b10077fcd6a65fac112d"} Mar 19 09:50:10.924109 master-0 kubenswrapper[15202]: I0319 09:50:10.923932 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1504e0a128e89fe7c2afb83f2858869f2e9b43180c04b10077fcd6a65fac112d" Mar 19 09:50:10.979739 master-0 kubenswrapper[15202]: I0319 09:50:10.977578 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308d6d-e494-4e76-8b8f-5c661340666b-operator-scripts\") pod \"7b308d6d-e494-4e76-8b8f-5c661340666b\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " Mar 19 09:50:10.979739 master-0 kubenswrapper[15202]: I0319 09:50:10.977889 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wb2t\" (UniqueName: \"kubernetes.io/projected/7b308d6d-e494-4e76-8b8f-5c661340666b-kube-api-access-8wb2t\") pod \"7b308d6d-e494-4e76-8b8f-5c661340666b\" (UID: \"7b308d6d-e494-4e76-8b8f-5c661340666b\") " Mar 19 09:50:10.979739 master-0 kubenswrapper[15202]: I0319 09:50:10.978572 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b308d6d-e494-4e76-8b8f-5c661340666b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b308d6d-e494-4e76-8b8f-5c661340666b" (UID: "7b308d6d-e494-4e76-8b8f-5c661340666b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:10.979739 master-0 kubenswrapper[15202]: I0319 09:50:10.978936 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308d6d-e494-4e76-8b8f-5c661340666b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:10.988460 master-0 kubenswrapper[15202]: I0319 09:50:10.985576 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b308d6d-e494-4e76-8b8f-5c661340666b-kube-api-access-8wb2t" (OuterVolumeSpecName: "kube-api-access-8wb2t") pod "7b308d6d-e494-4e76-8b8f-5c661340666b" (UID: "7b308d6d-e494-4e76-8b8f-5c661340666b"). InnerVolumeSpecName "kube-api-access-8wb2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:11.080603 master-0 kubenswrapper[15202]: I0319 09:50:11.080550 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wb2t\" (UniqueName: \"kubernetes.io/projected/7b308d6d-e494-4e76-8b8f-5c661340666b-kube-api-access-8wb2t\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:11.138941 master-0 kubenswrapper[15202]: I0319 09:50:11.138720 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m68fw-config-f49xm"] Mar 19 09:50:11.141046 master-0 kubenswrapper[15202]: W0319 09:50:11.140852 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8707bf06_8fff_40da_b16e_6c1700423045.slice/crio-f2c09ef95794b119e4363f912b5cb24fd9516dbf050d77074e6d809f6d0820fb WatchSource:0}: Error finding container f2c09ef95794b119e4363f912b5cb24fd9516dbf050d77074e6d809f6d0820fb: Status 404 returned error can't find the container with id f2c09ef95794b119e4363f912b5cb24fd9516dbf050d77074e6d809f6d0820fb Mar 19 09:50:11.934323 master-0 kubenswrapper[15202]: I0319 09:50:11.934253 15202 generic.go:334] "Generic (PLEG): container finished" 
podID="8707bf06-8fff-40da-b16e-6c1700423045" containerID="4b46b44ed883040d9b5cc4af5e2c5fbbdb384f3131b50be056f41674e66144a8" exitCode=0 Mar 19 09:50:11.934323 master-0 kubenswrapper[15202]: I0319 09:50:11.934306 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw-config-f49xm" event={"ID":"8707bf06-8fff-40da-b16e-6c1700423045","Type":"ContainerDied","Data":"4b46b44ed883040d9b5cc4af5e2c5fbbdb384f3131b50be056f41674e66144a8"} Mar 19 09:50:11.934990 master-0 kubenswrapper[15202]: I0319 09:50:11.934369 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw-config-f49xm" event={"ID":"8707bf06-8fff-40da-b16e-6c1700423045","Type":"ContainerStarted","Data":"f2c09ef95794b119e4363f912b5cb24fd9516dbf050d77074e6d809f6d0820fb"} Mar 19 09:50:11.937259 master-0 kubenswrapper[15202]: I0319 09:50:11.937215 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zxw2c" event={"ID":"c9bce886-0af6-432e-ab68-b4af30a4defd","Type":"ContainerStarted","Data":"c1b707bc875ad8212e30e4e8f547814d3aaefa608648a913c592528a62709843"} Mar 19 09:50:11.944493 master-0 kubenswrapper[15202]: I0319 09:50:11.944309 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"93735b6372faafdc8b066412d3876b2270c836015c8c28701b988c8e7513dd9d"} Mar 19 09:50:11.944493 master-0 kubenswrapper[15202]: I0319 09:50:11.944345 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"e20d5bd8e805276e087d48c6e0762e4027b7d427c08f9bc9db571ba8f122def7"} Mar 19 09:50:11.944493 master-0 kubenswrapper[15202]: I0319 09:50:11.944355 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"d309eabf720f69a6f3fa1af19b612f463ed5f1d95000d857c580f2946b970ce7"} Mar 19 09:50:11.944493 master-0 kubenswrapper[15202]: I0319 09:50:11.944364 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d9c99748-0ca1-4e25-947a-801c2e8748f5","Type":"ContainerStarted","Data":"5af43870802c46ac75e5d88d3f008b7d01d7d4fd84c9c8a10d082cdfd13b1067"} Mar 19 09:50:12.014574 master-0 kubenswrapper[15202]: I0319 09:50:12.014093 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=28.887631179 podStartE2EDuration="35.014077371s" podCreationTimestamp="2026-03-19 09:49:37 +0000 UTC" firstStartedPulling="2026-03-19 09:49:56.253627667 +0000 UTC m=+1513.639042483" lastFinishedPulling="2026-03-19 09:50:02.380073859 +0000 UTC m=+1519.765488675" observedRunningTime="2026-03-19 09:50:12.013064756 +0000 UTC m=+1529.398479582" watchObservedRunningTime="2026-03-19 09:50:12.014077371 +0000 UTC m=+1529.399492187" Mar 19 09:50:12.044495 master-0 kubenswrapper[15202]: I0319 09:50:12.040414 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zxw2c" podStartSLOduration=5.018972692 podStartE2EDuration="20.04039708s" podCreationTimestamp="2026-03-19 09:49:52 +0000 UTC" firstStartedPulling="2026-03-19 09:49:55.757860826 +0000 UTC m=+1513.143275642" lastFinishedPulling="2026-03-19 09:50:10.779285214 +0000 UTC m=+1528.164700030" observedRunningTime="2026-03-19 09:50:12.033697695 +0000 UTC m=+1529.419112531" watchObservedRunningTime="2026-03-19 09:50:12.04039708 +0000 UTC m=+1529.425811896" Mar 19 09:50:12.355304 master-0 kubenswrapper[15202]: I0319 09:50:12.352713 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9748bd58f-s2fbq"] Mar 19 09:50:12.355304 master-0 kubenswrapper[15202]: E0319 09:50:12.353208 15202 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b308d6d-e494-4e76-8b8f-5c661340666b" containerName="mariadb-account-create-update" Mar 19 09:50:12.355304 master-0 kubenswrapper[15202]: I0319 09:50:12.353222 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b308d6d-e494-4e76-8b8f-5c661340666b" containerName="mariadb-account-create-update" Mar 19 09:50:12.355304 master-0 kubenswrapper[15202]: I0319 09:50:12.353427 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b308d6d-e494-4e76-8b8f-5c661340666b" containerName="mariadb-account-create-update" Mar 19 09:50:12.370494 master-0 kubenswrapper[15202]: I0319 09:50:12.366642 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.371499 master-0 kubenswrapper[15202]: I0319 09:50:12.370901 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 19 09:50:12.377694 master-0 kubenswrapper[15202]: I0319 09:50:12.377621 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9748bd58f-s2fbq"] Mar 19 09:50:12.514508 master-0 kubenswrapper[15202]: I0319 09:50:12.513197 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.514508 master-0 kubenswrapper[15202]: I0319 09:50:12.513315 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-swift-storage-0\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " 
pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.514508 master-0 kubenswrapper[15202]: I0319 09:50:12.513371 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-config\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.514508 master-0 kubenswrapper[15202]: I0319 09:50:12.513418 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.514508 master-0 kubenswrapper[15202]: I0319 09:50:12.513518 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-svc\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.514508 master-0 kubenswrapper[15202]: I0319 09:50:12.513576 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvbn\" (UniqueName: \"kubernetes.io/projected/5c700b42-1e60-4ea7-9837-c7474f999c0b-kube-api-access-clvbn\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.615865 master-0 kubenswrapper[15202]: I0319 09:50:12.615720 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-swift-storage-0\") pod 
\"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.615865 master-0 kubenswrapper[15202]: I0319 09:50:12.615856 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-config\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.616115 master-0 kubenswrapper[15202]: I0319 09:50:12.615919 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.616115 master-0 kubenswrapper[15202]: I0319 09:50:12.615997 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-svc\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.616270 master-0 kubenswrapper[15202]: I0319 09:50:12.616232 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvbn\" (UniqueName: \"kubernetes.io/projected/5c700b42-1e60-4ea7-9837-c7474f999c0b-kube-api-access-clvbn\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.616369 master-0 kubenswrapper[15202]: I0319 09:50:12.616344 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.616769 master-0 kubenswrapper[15202]: I0319 09:50:12.616730 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-swift-storage-0\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.616904 master-0 kubenswrapper[15202]: I0319 09:50:12.616873 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.617276 master-0 kubenswrapper[15202]: I0319 09:50:12.617244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-svc\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.617411 master-0 kubenswrapper[15202]: I0319 09:50:12.617380 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.617619 master-0 kubenswrapper[15202]: I0319 09:50:12.617580 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-config\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: 
\"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.634692 master-0 kubenswrapper[15202]: I0319 09:50:12.634635 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvbn\" (UniqueName: \"kubernetes.io/projected/5c700b42-1e60-4ea7-9837-c7474f999c0b-kube-api-access-clvbn\") pod \"dnsmasq-dns-9748bd58f-s2fbq\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.692800 master-0 kubenswrapper[15202]: I0319 09:50:12.692741 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:12.698676 master-0 kubenswrapper[15202]: I0319 09:50:12.698633 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:50:13.165935 master-0 kubenswrapper[15202]: I0319 09:50:13.165869 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-m68fw" Mar 19 09:50:13.325508 master-0 kubenswrapper[15202]: I0319 09:50:13.323854 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9748bd58f-s2fbq"] Mar 19 09:50:13.495105 master-0 kubenswrapper[15202]: I0319 09:50:13.494650 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540519 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-additional-scripts\") pod \"8707bf06-8fff-40da-b16e-6c1700423045\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540600 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run-ovn\") pod \"8707bf06-8fff-40da-b16e-6c1700423045\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540672 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mtk8\" (UniqueName: \"kubernetes.io/projected/8707bf06-8fff-40da-b16e-6c1700423045-kube-api-access-2mtk8\") pod \"8707bf06-8fff-40da-b16e-6c1700423045\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540746 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-log-ovn\") pod \"8707bf06-8fff-40da-b16e-6c1700423045\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540822 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run\") pod \"8707bf06-8fff-40da-b16e-6c1700423045\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540837 15202 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8707bf06-8fff-40da-b16e-6c1700423045" (UID: "8707bf06-8fff-40da-b16e-6c1700423045"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540869 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-scripts\") pod \"8707bf06-8fff-40da-b16e-6c1700423045\" (UID: \"8707bf06-8fff-40da-b16e-6c1700423045\") " Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.540905 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8707bf06-8fff-40da-b16e-6c1700423045" (UID: "8707bf06-8fff-40da-b16e-6c1700423045"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.541273 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8707bf06-8fff-40da-b16e-6c1700423045" (UID: "8707bf06-8fff-40da-b16e-6c1700423045"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:13.541328 master-0 kubenswrapper[15202]: I0319 09:50:13.541260 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run" (OuterVolumeSpecName: "var-run") pod "8707bf06-8fff-40da-b16e-6c1700423045" (UID: "8707bf06-8fff-40da-b16e-6c1700423045"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:50:13.541849 master-0 kubenswrapper[15202]: I0319 09:50:13.541413 15202 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:13.541849 master-0 kubenswrapper[15202]: I0319 09:50:13.541439 15202 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:13.542383 master-0 kubenswrapper[15202]: I0319 09:50:13.542323 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-scripts" (OuterVolumeSpecName: "scripts") pod "8707bf06-8fff-40da-b16e-6c1700423045" (UID: "8707bf06-8fff-40da-b16e-6c1700423045"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:13.547846 master-0 kubenswrapper[15202]: I0319 09:50:13.547777 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8707bf06-8fff-40da-b16e-6c1700423045-kube-api-access-2mtk8" (OuterVolumeSpecName: "kube-api-access-2mtk8") pod "8707bf06-8fff-40da-b16e-6c1700423045" (UID: "8707bf06-8fff-40da-b16e-6c1700423045"). InnerVolumeSpecName "kube-api-access-2mtk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:13.643674 master-0 kubenswrapper[15202]: I0319 09:50:13.643623 15202 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:13.643674 master-0 kubenswrapper[15202]: I0319 09:50:13.643665 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mtk8\" (UniqueName: \"kubernetes.io/projected/8707bf06-8fff-40da-b16e-6c1700423045-kube-api-access-2mtk8\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:13.643674 master-0 kubenswrapper[15202]: I0319 09:50:13.643678 15202 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8707bf06-8fff-40da-b16e-6c1700423045-var-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:13.643674 master-0 kubenswrapper[15202]: I0319 09:50:13.643688 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8707bf06-8fff-40da-b16e-6c1700423045-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:13.978551 master-0 kubenswrapper[15202]: I0319 09:50:13.975512 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw-config-f49xm" event={"ID":"8707bf06-8fff-40da-b16e-6c1700423045","Type":"ContainerDied","Data":"f2c09ef95794b119e4363f912b5cb24fd9516dbf050d77074e6d809f6d0820fb"} Mar 19 09:50:13.978551 master-0 kubenswrapper[15202]: I0319 09:50:13.975560 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c09ef95794b119e4363f912b5cb24fd9516dbf050d77074e6d809f6d0820fb" Mar 19 09:50:13.978551 master-0 kubenswrapper[15202]: I0319 09:50:13.975627 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m68fw-config-f49xm" Mar 19 09:50:13.985950 master-0 kubenswrapper[15202]: I0319 09:50:13.985897 15202 generic.go:334] "Generic (PLEG): container finished" podID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerID="e81aa67c41b386c44d8f394d64776510502d57467e8fc1b7480f4aff25c1565f" exitCode=0 Mar 19 09:50:13.985950 master-0 kubenswrapper[15202]: I0319 09:50:13.985948 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" event={"ID":"5c700b42-1e60-4ea7-9837-c7474f999c0b","Type":"ContainerDied","Data":"e81aa67c41b386c44d8f394d64776510502d57467e8fc1b7480f4aff25c1565f"} Mar 19 09:50:13.986235 master-0 kubenswrapper[15202]: I0319 09:50:13.985976 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" event={"ID":"5c700b42-1e60-4ea7-9837-c7474f999c0b","Type":"ContainerStarted","Data":"faa3f4e96a853131332c6e18bec61e1381293ac570d597ce371817af8cc2477f"} Mar 19 09:50:14.216848 master-0 kubenswrapper[15202]: I0319 09:50:14.216789 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 09:50:14.630653 master-0 kubenswrapper[15202]: I0319 09:50:14.628696 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m68fw-config-f49xm"] Mar 19 09:50:14.653595 master-0 kubenswrapper[15202]: I0319 09:50:14.650165 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m68fw-config-f49xm"] Mar 19 09:50:14.803672 master-0 kubenswrapper[15202]: I0319 09:50:14.803592 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8nppp"] Mar 19 09:50:14.804192 master-0 kubenswrapper[15202]: E0319 09:50:14.804158 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8707bf06-8fff-40da-b16e-6c1700423045" containerName="ovn-config" Mar 19 09:50:14.804192 master-0 kubenswrapper[15202]: I0319 
09:50:14.804184 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8707bf06-8fff-40da-b16e-6c1700423045" containerName="ovn-config" Mar 19 09:50:14.804614 master-0 kubenswrapper[15202]: I0319 09:50:14.804530 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8707bf06-8fff-40da-b16e-6c1700423045" containerName="ovn-config" Mar 19 09:50:14.805439 master-0 kubenswrapper[15202]: I0319 09:50:14.805404 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8nppp" Mar 19 09:50:14.846227 master-0 kubenswrapper[15202]: I0319 09:50:14.846122 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8707bf06-8fff-40da-b16e-6c1700423045" path="/var/lib/kubelet/pods/8707bf06-8fff-40da-b16e-6c1700423045/volumes" Mar 19 09:50:14.855660 master-0 kubenswrapper[15202]: I0319 09:50:14.854004 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8nppp"] Mar 19 09:50:14.855660 master-0 kubenswrapper[15202]: I0319 09:50:14.855406 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m68fw-config-tzm6j"] Mar 19 09:50:14.860737 master-0 kubenswrapper[15202]: I0319 09:50:14.857373 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.866339 master-0 kubenswrapper[15202]: I0319 09:50:14.865938 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 09:50:14.883578 master-0 kubenswrapper[15202]: I0319 09:50:14.880844 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkgf\" (UniqueName: \"kubernetes.io/projected/482a5739-51df-41c5-89b0-8ea2e82cee8a-kube-api-access-xjkgf\") pod \"cinder-db-create-8nppp\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") " pod="openstack/cinder-db-create-8nppp" Mar 19 09:50:14.883578 master-0 kubenswrapper[15202]: I0319 09:50:14.880966 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/482a5739-51df-41c5-89b0-8ea2e82cee8a-operator-scripts\") pod \"cinder-db-create-8nppp\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") " pod="openstack/cinder-db-create-8nppp" Mar 19 09:50:14.900209 master-0 kubenswrapper[15202]: I0319 09:50:14.897072 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m68fw-config-tzm6j"] Mar 19 09:50:14.916980 master-0 kubenswrapper[15202]: I0319 09:50:14.916923 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3735-account-create-update-59xbx"] Mar 19 09:50:14.926782 master-0 kubenswrapper[15202]: I0319 09:50:14.926729 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3735-account-create-update-59xbx" Mar 19 09:50:14.929116 master-0 kubenswrapper[15202]: I0319 09:50:14.929057 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 19 09:50:14.982991 master-0 kubenswrapper[15202]: I0319 09:50:14.982869 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-log-ovn\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.983300 master-0 kubenswrapper[15202]: I0319 09:50:14.983279 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run-ovn\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.983521 master-0 kubenswrapper[15202]: I0319 09:50:14.983504 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.983634 master-0 kubenswrapper[15202]: I0319 09:50:14.983619 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-scripts\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.983755 master-0 kubenswrapper[15202]: 
I0319 09:50:14.983739 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-additional-scripts\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.983871 master-0 kubenswrapper[15202]: I0319 09:50:14.983857 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkjb\" (UniqueName: \"kubernetes.io/projected/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-kube-api-access-nlkjb\") pod \"cinder-3735-account-create-update-59xbx\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") " pod="openstack/cinder-3735-account-create-update-59xbx" Mar 19 09:50:14.983977 master-0 kubenswrapper[15202]: I0319 09:50:14.983964 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkgf\" (UniqueName: \"kubernetes.io/projected/482a5739-51df-41c5-89b0-8ea2e82cee8a-kube-api-access-xjkgf\") pod \"cinder-db-create-8nppp\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") " pod="openstack/cinder-db-create-8nppp" Mar 19 09:50:14.984059 master-0 kubenswrapper[15202]: I0319 09:50:14.984047 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd7qk\" (UniqueName: \"kubernetes.io/projected/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-kube-api-access-xd7qk\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:14.984219 master-0 kubenswrapper[15202]: I0319 09:50:14.984203 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-operator-scripts\") pod 
\"cinder-3735-account-create-update-59xbx\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") " pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:14.984316 master-0 kubenswrapper[15202]: I0319 09:50:14.984300 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/482a5739-51df-41c5-89b0-8ea2e82cee8a-operator-scripts\") pod \"cinder-db-create-8nppp\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") " pod="openstack/cinder-db-create-8nppp"
Mar 19 09:50:14.985449 master-0 kubenswrapper[15202]: I0319 09:50:14.985425 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/482a5739-51df-41c5-89b0-8ea2e82cee8a-operator-scripts\") pod \"cinder-db-create-8nppp\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") " pod="openstack/cinder-db-create-8nppp"
Mar 19 09:50:15.003365 master-0 kubenswrapper[15202]: I0319 09:50:15.003291 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3735-account-create-update-59xbx"]
Mar 19 09:50:15.017495 master-0 kubenswrapper[15202]: I0319 09:50:15.014105 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkgf\" (UniqueName: \"kubernetes.io/projected/482a5739-51df-41c5-89b0-8ea2e82cee8a-kube-api-access-xjkgf\") pod \"cinder-db-create-8nppp\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") " pod="openstack/cinder-db-create-8nppp"
Mar 19 09:50:15.026900 master-0 kubenswrapper[15202]: I0319 09:50:15.026860 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" event={"ID":"5c700b42-1e60-4ea7-9837-c7474f999c0b","Type":"ContainerStarted","Data":"a44cdcbb716944e9e26b3e2360e1a17368c0d236d50cf6c35c4f60a21d2a6ba0"}
Mar 19 09:50:15.027675 master-0 kubenswrapper[15202]: I0319 09:50:15.027626 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq"
Mar 19 09:50:15.061905 master-0 kubenswrapper[15202]: I0319 09:50:15.058141 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" podStartSLOduration=3.058116245 podStartE2EDuration="3.058116245s" podCreationTimestamp="2026-03-19 09:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:15.048646251 +0000 UTC m=+1532.434061087" watchObservedRunningTime="2026-03-19 09:50:15.058116245 +0000 UTC m=+1532.443531061"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086110 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run-ovn\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086209 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086256 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-scripts\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086291 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-additional-scripts\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086336 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkjb\" (UniqueName: \"kubernetes.io/projected/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-kube-api-access-nlkjb\") pod \"cinder-3735-account-create-update-59xbx\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") " pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086384 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd7qk\" (UniqueName: \"kubernetes.io/projected/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-kube-api-access-xd7qk\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086426 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-operator-scripts\") pod \"cinder-3735-account-create-update-59xbx\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") " pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086495 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-log-ovn\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.086894 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run-ovn\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.087202 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.089330 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-scripts\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.090603 master-0 kubenswrapper[15202]: I0319 09:50:15.089760 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-additional-scripts\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.091239 master-0 kubenswrapper[15202]: I0319 09:50:15.090976 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-operator-scripts\") pod \"cinder-3735-account-create-update-59xbx\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") " pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:15.091239 master-0 kubenswrapper[15202]: I0319 09:50:15.091167 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-log-ovn\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.111312 master-0 kubenswrapper[15202]: I0319 09:50:15.111263 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd7qk\" (UniqueName: \"kubernetes.io/projected/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-kube-api-access-xd7qk\") pod \"ovn-controller-m68fw-config-tzm6j\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") " pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.115904 master-0 kubenswrapper[15202]: I0319 09:50:15.115870 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkjb\" (UniqueName: \"kubernetes.io/projected/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-kube-api-access-nlkjb\") pod \"cinder-3735-account-create-update-59xbx\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") " pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:15.136738 master-0 kubenswrapper[15202]: I0319 09:50:15.136608 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8nppp"
Mar 19 09:50:15.274302 master-0 kubenswrapper[15202]: I0319 09:50:15.274249 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:15.290717 master-0 kubenswrapper[15202]: I0319 09:50:15.290669 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:15.297228 master-0 kubenswrapper[15202]: I0319 09:50:15.296442 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wjhhn"]
Mar 19 09:50:15.297868 master-0 kubenswrapper[15202]: I0319 09:50:15.297842 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.314541 master-0 kubenswrapper[15202]: I0319 09:50:15.314507 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vk8gz"]
Mar 19 09:50:15.316163 master-0 kubenswrapper[15202]: I0319 09:50:15.316138 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.320164 master-0 kubenswrapper[15202]: I0319 09:50:15.319364 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 09:50:15.320164 master-0 kubenswrapper[15202]: I0319 09:50:15.319600 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 09:50:15.320734 master-0 kubenswrapper[15202]: I0319 09:50:15.320719 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 09:50:15.414046 master-0 kubenswrapper[15202]: I0319 09:50:15.413911 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wjhhn"]
Mar 19 09:50:15.446505 master-0 kubenswrapper[15202]: I0319 09:50:15.436912 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vk8gz"]
Mar 19 09:50:15.508800 master-0 kubenswrapper[15202]: I0319 09:50:15.505253 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-886c-account-create-update-24ntn"]
Mar 19 09:50:15.508800 master-0 kubenswrapper[15202]: I0319 09:50:15.507922 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.519510 master-0 kubenswrapper[15202]: I0319 09:50:15.512817 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 19 09:50:15.532993 master-0 kubenswrapper[15202]: I0319 09:50:15.520121 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/310506d0-94fa-4573-a29a-5ddd012b6e64-operator-scripts\") pod \"neutron-db-create-wjhhn\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") " pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.532993 master-0 kubenswrapper[15202]: I0319 09:50:15.520221 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29s2q\" (UniqueName: \"kubernetes.io/projected/310506d0-94fa-4573-a29a-5ddd012b6e64-kube-api-access-29s2q\") pod \"neutron-db-create-wjhhn\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") " pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.532993 master-0 kubenswrapper[15202]: I0319 09:50:15.520312 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrcq\" (UniqueName: \"kubernetes.io/projected/9fefe00c-9546-4205-a4e5-a73e807d6bf4-kube-api-access-pmrcq\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.532993 master-0 kubenswrapper[15202]: I0319 09:50:15.521442 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.532993 master-0 kubenswrapper[15202]: I0319 09:50:15.521521 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-config-data\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.532993 master-0 kubenswrapper[15202]: I0319 09:50:15.527310 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-886c-account-create-update-24ntn"]
Mar 19 09:50:15.627758 master-0 kubenswrapper[15202]: I0319 09:50:15.627705 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.628058 master-0 kubenswrapper[15202]: I0319 09:50:15.628033 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-config-data\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.628233 master-0 kubenswrapper[15202]: I0319 09:50:15.628212 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4nz\" (UniqueName: \"kubernetes.io/projected/3c485a6b-17be-4afe-8110-a57f77347be1-kube-api-access-7r4nz\") pod \"neutron-886c-account-create-update-24ntn\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") " pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.628352 master-0 kubenswrapper[15202]: I0319 09:50:15.628334 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c485a6b-17be-4afe-8110-a57f77347be1-operator-scripts\") pod \"neutron-886c-account-create-update-24ntn\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") " pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.628567 master-0 kubenswrapper[15202]: I0319 09:50:15.628543 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/310506d0-94fa-4573-a29a-5ddd012b6e64-operator-scripts\") pod \"neutron-db-create-wjhhn\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") " pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.628718 master-0 kubenswrapper[15202]: I0319 09:50:15.628697 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29s2q\" (UniqueName: \"kubernetes.io/projected/310506d0-94fa-4573-a29a-5ddd012b6e64-kube-api-access-29s2q\") pod \"neutron-db-create-wjhhn\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") " pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.628965 master-0 kubenswrapper[15202]: I0319 09:50:15.628944 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrcq\" (UniqueName: \"kubernetes.io/projected/9fefe00c-9546-4205-a4e5-a73e807d6bf4-kube-api-access-pmrcq\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.630108 master-0 kubenswrapper[15202]: I0319 09:50:15.630091 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/310506d0-94fa-4573-a29a-5ddd012b6e64-operator-scripts\") pod \"neutron-db-create-wjhhn\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") " pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.642078 master-0 kubenswrapper[15202]: I0319 09:50:15.636335 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-config-data\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.642923 master-0 kubenswrapper[15202]: I0319 09:50:15.637949 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.648809 master-0 kubenswrapper[15202]: I0319 09:50:15.648763 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29s2q\" (UniqueName: \"kubernetes.io/projected/310506d0-94fa-4573-a29a-5ddd012b6e64-kube-api-access-29s2q\") pod \"neutron-db-create-wjhhn\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") " pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:15.650960 master-0 kubenswrapper[15202]: I0319 09:50:15.650897 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrcq\" (UniqueName: \"kubernetes.io/projected/9fefe00c-9546-4205-a4e5-a73e807d6bf4-kube-api-access-pmrcq\") pod \"keystone-db-sync-vk8gz\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.716751 master-0 kubenswrapper[15202]: I0319 09:50:15.716560 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:15.730147 master-0 kubenswrapper[15202]: I0319 09:50:15.730034 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4nz\" (UniqueName: \"kubernetes.io/projected/3c485a6b-17be-4afe-8110-a57f77347be1-kube-api-access-7r4nz\") pod \"neutron-886c-account-create-update-24ntn\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") " pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.730147 master-0 kubenswrapper[15202]: I0319 09:50:15.730106 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c485a6b-17be-4afe-8110-a57f77347be1-operator-scripts\") pod \"neutron-886c-account-create-update-24ntn\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") " pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.731767 master-0 kubenswrapper[15202]: I0319 09:50:15.731713 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c485a6b-17be-4afe-8110-a57f77347be1-operator-scripts\") pod \"neutron-886c-account-create-update-24ntn\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") " pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.758778 master-0 kubenswrapper[15202]: I0319 09:50:15.758721 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4nz\" (UniqueName: \"kubernetes.io/projected/3c485a6b-17be-4afe-8110-a57f77347be1-kube-api-access-7r4nz\") pod \"neutron-886c-account-create-update-24ntn\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") " pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.846735 master-0 kubenswrapper[15202]: I0319 09:50:15.846671 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8nppp"]
Mar 19 09:50:15.898885 master-0 kubenswrapper[15202]: I0319 09:50:15.898827 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:15.963090 master-0 kubenswrapper[15202]: I0319 09:50:15.961152 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:16.029602 master-0 kubenswrapper[15202]: I0319 09:50:16.018021 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m68fw-config-tzm6j"]
Mar 19 09:50:16.048496 master-0 kubenswrapper[15202]: W0319 09:50:16.048416 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6c9f92b_f7c6_41b1_8260_f7f33c16fc1f.slice/crio-933b8fec9f1400c97de685736abfaa1fffd92405978f0138aeea6235e7bd3461 WatchSource:0}: Error finding container 933b8fec9f1400c97de685736abfaa1fffd92405978f0138aeea6235e7bd3461: Status 404 returned error can't find the container with id 933b8fec9f1400c97de685736abfaa1fffd92405978f0138aeea6235e7bd3461
Mar 19 09:50:16.062241 master-0 kubenswrapper[15202]: I0319 09:50:16.062148 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3735-account-create-update-59xbx"]
Mar 19 09:50:16.067174 master-0 kubenswrapper[15202]: I0319 09:50:16.066748 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8nppp" event={"ID":"482a5739-51df-41c5-89b0-8ea2e82cee8a","Type":"ContainerStarted","Data":"863a2230400dad99fe75bdc361d00d8e83a9df5751e91711191684b15cfb07a0"}
Mar 19 09:50:16.346095 master-0 kubenswrapper[15202]: I0319 09:50:16.344883 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vk8gz"]
Mar 19 09:50:16.602777 master-0 kubenswrapper[15202]: I0319 09:50:16.601949 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-886c-account-create-update-24ntn"]
Mar 19 09:50:16.620148 master-0 kubenswrapper[15202]: I0319 09:50:16.619075 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wjhhn"]
Mar 19 09:50:16.681858 master-0 kubenswrapper[15202]: E0319 09:50:16.680113 15202 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:34888->192.168.32.10:35219: write tcp 192.168.32.10:34888->192.168.32.10:35219: write: broken pipe
Mar 19 09:50:17.082320 master-0 kubenswrapper[15202]: I0319 09:50:17.082236 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wjhhn" event={"ID":"310506d0-94fa-4573-a29a-5ddd012b6e64","Type":"ContainerStarted","Data":"4fcad111ea9d40bd2336613c4152e6d793f8b7459f474e8bfe24622587250403"}
Mar 19 09:50:17.082320 master-0 kubenswrapper[15202]: I0319 09:50:17.082308 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wjhhn" event={"ID":"310506d0-94fa-4573-a29a-5ddd012b6e64","Type":"ContainerStarted","Data":"62b0104bd9a893afbfa9ca94eecace9ca77ca6f2ed4bf3c250908cd7a6955d3e"}
Mar 19 09:50:17.084409 master-0 kubenswrapper[15202]: I0319 09:50:17.084367 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vk8gz" event={"ID":"9fefe00c-9546-4205-a4e5-a73e807d6bf4","Type":"ContainerStarted","Data":"49204e09f1e0033d5afb6395c670c144fedc912d60356abda5e5ea20969b8f15"}
Mar 19 09:50:17.086139 master-0 kubenswrapper[15202]: I0319 09:50:17.086096 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-886c-account-create-update-24ntn" event={"ID":"3c485a6b-17be-4afe-8110-a57f77347be1","Type":"ContainerStarted","Data":"9d9eca4b3f22a2760500a2b0f6800524d7e8068e31b227e0c275e5169dc7a91d"}
Mar 19 09:50:17.086209 master-0 kubenswrapper[15202]: I0319 09:50:17.086138 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-886c-account-create-update-24ntn" event={"ID":"3c485a6b-17be-4afe-8110-a57f77347be1","Type":"ContainerStarted","Data":"17bdab900e1443663225c0d1cabbb01cc64346b3f1386ab88236cc2c2e667a5e"}
Mar 19 09:50:17.088684 master-0 kubenswrapper[15202]: I0319 09:50:17.088641 15202 generic.go:334] "Generic (PLEG): container finished" podID="592e718b-4b77-4ef3-8ee0-e7ce98415e3c" containerID="16cfa87650ea68a18704ad490c72a2cc838c467216e77480298c8d54d6c0a2b3" exitCode=0
Mar 19 09:50:17.091704 master-0 kubenswrapper[15202]: I0319 09:50:17.091642 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3735-account-create-update-59xbx" event={"ID":"592e718b-4b77-4ef3-8ee0-e7ce98415e3c","Type":"ContainerDied","Data":"16cfa87650ea68a18704ad490c72a2cc838c467216e77480298c8d54d6c0a2b3"}
Mar 19 09:50:17.091812 master-0 kubenswrapper[15202]: I0319 09:50:17.091726 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3735-account-create-update-59xbx" event={"ID":"592e718b-4b77-4ef3-8ee0-e7ce98415e3c","Type":"ContainerStarted","Data":"b7cc05f29a9cb878a2b1b6791008ce62333cd5d5b2c7f02a65f27c15fea92d4b"}
Mar 19 09:50:17.097026 master-0 kubenswrapper[15202]: I0319 09:50:17.096516 15202 generic.go:334] "Generic (PLEG): container finished" podID="b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" containerID="5a63a529310cf423d9ea418b0b8e064e9c7716971f0f72c8873aa5512c01c530" exitCode=0
Mar 19 09:50:17.097026 master-0 kubenswrapper[15202]: I0319 09:50:17.096693 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw-config-tzm6j" event={"ID":"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f","Type":"ContainerDied","Data":"5a63a529310cf423d9ea418b0b8e064e9c7716971f0f72c8873aa5512c01c530"}
Mar 19 09:50:17.097026 master-0 kubenswrapper[15202]: I0319 09:50:17.096721 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw-config-tzm6j" event={"ID":"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f","Type":"ContainerStarted","Data":"933b8fec9f1400c97de685736abfaa1fffd92405978f0138aeea6235e7bd3461"}
Mar 19 09:50:17.115717 master-0 kubenswrapper[15202]: I0319 09:50:17.107598 15202 generic.go:334] "Generic (PLEG): container finished" podID="482a5739-51df-41c5-89b0-8ea2e82cee8a" containerID="fd52e98f6b5e95065c9e493daa1210887d0bb74539312b3915d6c7cf6725b9d7" exitCode=0
Mar 19 09:50:17.115717 master-0 kubenswrapper[15202]: I0319 09:50:17.107661 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8nppp" event={"ID":"482a5739-51df-41c5-89b0-8ea2e82cee8a","Type":"ContainerDied","Data":"fd52e98f6b5e95065c9e493daa1210887d0bb74539312b3915d6c7cf6725b9d7"}
Mar 19 09:50:17.115717 master-0 kubenswrapper[15202]: I0319 09:50:17.112088 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-wjhhn" podStartSLOduration=2.112068793 podStartE2EDuration="2.112068793s" podCreationTimestamp="2026-03-19 09:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:17.1030199 +0000 UTC m=+1534.488434716" watchObservedRunningTime="2026-03-19 09:50:17.112068793 +0000 UTC m=+1534.497483629"
Mar 19 09:50:17.159554 master-0 kubenswrapper[15202]: I0319 09:50:17.155112 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-886c-account-create-update-24ntn" podStartSLOduration=2.155091854 podStartE2EDuration="2.155091854s" podCreationTimestamp="2026-03-19 09:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:17.142162825 +0000 UTC m=+1534.527577641" watchObservedRunningTime="2026-03-19 09:50:17.155091854 +0000 UTC m=+1534.540506670"
Mar 19 09:50:18.125809 master-0 kubenswrapper[15202]: I0319 09:50:18.125726 15202 generic.go:334] "Generic (PLEG): container finished" podID="310506d0-94fa-4573-a29a-5ddd012b6e64" containerID="4fcad111ea9d40bd2336613c4152e6d793f8b7459f474e8bfe24622587250403" exitCode=0
Mar 19 09:50:18.126283 master-0 kubenswrapper[15202]: I0319 09:50:18.125825 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wjhhn" event={"ID":"310506d0-94fa-4573-a29a-5ddd012b6e64","Type":"ContainerDied","Data":"4fcad111ea9d40bd2336613c4152e6d793f8b7459f474e8bfe24622587250403"}
Mar 19 09:50:18.134190 master-0 kubenswrapper[15202]: I0319 09:50:18.133213 15202 generic.go:334] "Generic (PLEG): container finished" podID="3c485a6b-17be-4afe-8110-a57f77347be1" containerID="9d9eca4b3f22a2760500a2b0f6800524d7e8068e31b227e0c275e5169dc7a91d" exitCode=0
Mar 19 09:50:18.134190 master-0 kubenswrapper[15202]: I0319 09:50:18.133510 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-886c-account-create-update-24ntn" event={"ID":"3c485a6b-17be-4afe-8110-a57f77347be1","Type":"ContainerDied","Data":"9d9eca4b3f22a2760500a2b0f6800524d7e8068e31b227e0c275e5169dc7a91d"}
Mar 19 09:50:21.634325 master-0 kubenswrapper[15202]: I0319 09:50:21.634228 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wjhhn"
Mar 19 09:50:21.641090 master-0 kubenswrapper[15202]: I0319 09:50:21.641061 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8nppp"
Mar 19 09:50:21.692641 master-0 kubenswrapper[15202]: I0319 09:50:21.692563 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m68fw-config-tzm6j"
Mar 19 09:50:21.701331 master-0 kubenswrapper[15202]: I0319 09:50:21.701273 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3735-account-create-update-59xbx"
Mar 19 09:50:21.746335 master-0 kubenswrapper[15202]: I0319 09:50:21.745698 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-886c-account-create-update-24ntn"
Mar 19 09:50:21.770507 master-0 kubenswrapper[15202]: I0319 09:50:21.770135 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29s2q\" (UniqueName: \"kubernetes.io/projected/310506d0-94fa-4573-a29a-5ddd012b6e64-kube-api-access-29s2q\") pod \"310506d0-94fa-4573-a29a-5ddd012b6e64\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") "
Mar 19 09:50:21.770507 master-0 kubenswrapper[15202]: I0319 09:50:21.770287 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjkgf\" (UniqueName: \"kubernetes.io/projected/482a5739-51df-41c5-89b0-8ea2e82cee8a-kube-api-access-xjkgf\") pod \"482a5739-51df-41c5-89b0-8ea2e82cee8a\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") "
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.770877 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/310506d0-94fa-4573-a29a-5ddd012b6e64-operator-scripts\") pod \"310506d0-94fa-4573-a29a-5ddd012b6e64\" (UID: \"310506d0-94fa-4573-a29a-5ddd012b6e64\") "
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.770938 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/482a5739-51df-41c5-89b0-8ea2e82cee8a-operator-scripts\") pod \"482a5739-51df-41c5-89b0-8ea2e82cee8a\" (UID: \"482a5739-51df-41c5-89b0-8ea2e82cee8a\") "
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.771522 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482a5739-51df-41c5-89b0-8ea2e82cee8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "482a5739-51df-41c5-89b0-8ea2e82cee8a" (UID: "482a5739-51df-41c5-89b0-8ea2e82cee8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.771967 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310506d0-94fa-4573-a29a-5ddd012b6e64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "310506d0-94fa-4573-a29a-5ddd012b6e64" (UID: "310506d0-94fa-4573-a29a-5ddd012b6e64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.772893 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/310506d0-94fa-4573-a29a-5ddd012b6e64-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.772920 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/482a5739-51df-41c5-89b0-8ea2e82cee8a-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.774401 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482a5739-51df-41c5-89b0-8ea2e82cee8a-kube-api-access-xjkgf" (OuterVolumeSpecName: "kube-api-access-xjkgf") pod "482a5739-51df-41c5-89b0-8ea2e82cee8a" (UID: "482a5739-51df-41c5-89b0-8ea2e82cee8a"). InnerVolumeSpecName "kube-api-access-xjkgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:50:21.785653 master-0 kubenswrapper[15202]: I0319 09:50:21.781815 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310506d0-94fa-4573-a29a-5ddd012b6e64-kube-api-access-29s2q" (OuterVolumeSpecName: "kube-api-access-29s2q") pod "310506d0-94fa-4573-a29a-5ddd012b6e64" (UID: "310506d0-94fa-4573-a29a-5ddd012b6e64"). InnerVolumeSpecName "kube-api-access-29s2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:50:21.874196 master-0 kubenswrapper[15202]: I0319 09:50:21.874137 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run\") pod \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") "
Mar 19 09:50:21.874435 master-0 kubenswrapper[15202]: I0319 09:50:21.874241 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run" (OuterVolumeSpecName: "var-run") pod "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" (UID: "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:50:21.874435 master-0 kubenswrapper[15202]: I0319 09:50:21.874301 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-log-ovn\") pod \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") "
Mar 19 09:50:21.874435 master-0 kubenswrapper[15202]: I0319 09:50:21.874400 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" (UID: "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:50:21.874562 master-0 kubenswrapper[15202]: I0319 09:50:21.874416 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd7qk\" (UniqueName: \"kubernetes.io/projected/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-kube-api-access-xd7qk\") pod \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") "
Mar 19 09:50:21.874562 master-0 kubenswrapper[15202]: I0319 09:50:21.874507 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-additional-scripts\") pod \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") "
Mar 19 09:50:21.874562 master-0 kubenswrapper[15202]: I0319 09:50:21.874534 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c485a6b-17be-4afe-8110-a57f77347be1-operator-scripts\") pod \"3c485a6b-17be-4afe-8110-a57f77347be1\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") "
Mar 19 09:50:21.874653 master-0 kubenswrapper[15202]: I0319 09:50:21.874562 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-operator-scripts\") pod \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") "
Mar 19 09:50:21.874701 master-0 kubenswrapper[15202]: I0319 09:50:21.874678 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlkjb\" (UniqueName: \"kubernetes.io/projected/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-kube-api-access-nlkjb\") pod \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\" (UID: \"592e718b-4b77-4ef3-8ee0-e7ce98415e3c\") "
Mar 19 09:50:21.874746 master-0 kubenswrapper[15202]: I0319 09:50:21.874724 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r4nz\" (UniqueName: \"kubernetes.io/projected/3c485a6b-17be-4afe-8110-a57f77347be1-kube-api-access-7r4nz\") pod \"3c485a6b-17be-4afe-8110-a57f77347be1\" (UID: \"3c485a6b-17be-4afe-8110-a57f77347be1\") "
Mar 19 09:50:21.874822 master-0 kubenswrapper[15202]: I0319 09:50:21.874806 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-scripts\") pod \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") "
Mar 19 09:50:21.874867 master-0 kubenswrapper[15202]: I0319 09:50:21.874844 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run-ovn\") pod \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\" (UID: \"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f\") "
Mar 19 09:50:21.875305 master-0 kubenswrapper[15202]: I0319 09:50:21.875254 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c485a6b-17be-4afe-8110-a57f77347be1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c485a6b-17be-4afe-8110-a57f77347be1" (UID: "3c485a6b-17be-4afe-8110-a57f77347be1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:21.875408 master-0 kubenswrapper[15202]: I0319 09:50:21.875322 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" (UID: "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:50:21.875656 master-0 kubenswrapper[15202]: I0319 09:50:21.875315 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" (UID: "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f"). InnerVolumeSpecName "additional-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21.875764 master-0 kubenswrapper[15202]: I0319 09:50:21.875300 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29s2q\" (UniqueName: \"kubernetes.io/projected/310506d0-94fa-4573-a29a-5ddd012b6e64-kube-api-access-29s2q\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.875872 master-0 kubenswrapper[15202]: I0319 09:50:21.875859 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjkgf\" (UniqueName: \"kubernetes.io/projected/482a5739-51df-41c5-89b0-8ea2e82cee8a-kube-api-access-xjkgf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.875967 master-0 kubenswrapper[15202]: I0319 09:50:21.875955 15202 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.876064 master-0 kubenswrapper[15202]: I0319 09:50:21.876046 15202 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.876180 master-0 kubenswrapper[15202]: I0319 09:50:21.875863 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "592e718b-4b77-4ef3-8ee0-e7ce98415e3c" (UID: "592e718b-4b77-4ef3-8ee0-e7ce98415e3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21.876273 master-0 kubenswrapper[15202]: I0319 09:50:21.876132 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-scripts" (OuterVolumeSpecName: "scripts") pod "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" (UID: "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21.878314 master-0 kubenswrapper[15202]: I0319 09:50:21.878242 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c485a6b-17be-4afe-8110-a57f77347be1-kube-api-access-7r4nz" (OuterVolumeSpecName: "kube-api-access-7r4nz") pod "3c485a6b-17be-4afe-8110-a57f77347be1" (UID: "3c485a6b-17be-4afe-8110-a57f77347be1"). InnerVolumeSpecName "kube-api-access-7r4nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:21.878600 master-0 kubenswrapper[15202]: I0319 09:50:21.878566 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-kube-api-access-xd7qk" (OuterVolumeSpecName: "kube-api-access-xd7qk") pod "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" (UID: "b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f"). InnerVolumeSpecName "kube-api-access-xd7qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:21.879695 master-0 kubenswrapper[15202]: I0319 09:50:21.879641 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-kube-api-access-nlkjb" (OuterVolumeSpecName: "kube-api-access-nlkjb") pod "592e718b-4b77-4ef3-8ee0-e7ce98415e3c" (UID: "592e718b-4b77-4ef3-8ee0-e7ce98415e3c"). InnerVolumeSpecName "kube-api-access-nlkjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:21.979361 master-0 kubenswrapper[15202]: I0319 09:50:21.979277 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd7qk\" (UniqueName: \"kubernetes.io/projected/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-kube-api-access-xd7qk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979361 master-0 kubenswrapper[15202]: I0319 09:50:21.979357 15202 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979361 master-0 kubenswrapper[15202]: I0319 09:50:21.979375 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c485a6b-17be-4afe-8110-a57f77347be1-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979686 master-0 kubenswrapper[15202]: I0319 09:50:21.979394 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979686 master-0 kubenswrapper[15202]: I0319 09:50:21.979412 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlkjb\" (UniqueName: \"kubernetes.io/projected/592e718b-4b77-4ef3-8ee0-e7ce98415e3c-kube-api-access-nlkjb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979686 master-0 kubenswrapper[15202]: I0319 09:50:21.979426 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r4nz\" (UniqueName: \"kubernetes.io/projected/3c485a6b-17be-4afe-8110-a57f77347be1-kube-api-access-7r4nz\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979686 master-0 kubenswrapper[15202]: I0319 09:50:21.979440 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:21.979686 master-0 kubenswrapper[15202]: I0319 09:50:21.979455 15202 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:22.194780 master-0 kubenswrapper[15202]: I0319 09:50:22.194709 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3735-account-create-update-59xbx" event={"ID":"592e718b-4b77-4ef3-8ee0-e7ce98415e3c","Type":"ContainerDied","Data":"b7cc05f29a9cb878a2b1b6791008ce62333cd5d5b2c7f02a65f27c15fea92d4b"} Mar 19 09:50:22.195220 master-0 kubenswrapper[15202]: I0319 09:50:22.195202 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7cc05f29a9cb878a2b1b6791008ce62333cd5d5b2c7f02a65f27c15fea92d4b" Mar 19 09:50:22.195349 master-0 kubenswrapper[15202]: I0319 09:50:22.194796 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3735-account-create-update-59xbx" Mar 19 09:50:22.196590 master-0 kubenswrapper[15202]: I0319 09:50:22.196536 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m68fw-config-tzm6j" event={"ID":"b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f","Type":"ContainerDied","Data":"933b8fec9f1400c97de685736abfaa1fffd92405978f0138aeea6235e7bd3461"} Mar 19 09:50:22.196671 master-0 kubenswrapper[15202]: I0319 09:50:22.196597 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="933b8fec9f1400c97de685736abfaa1fffd92405978f0138aeea6235e7bd3461" Mar 19 09:50:22.196671 master-0 kubenswrapper[15202]: I0319 09:50:22.196548 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m68fw-config-tzm6j" Mar 19 09:50:22.198519 master-0 kubenswrapper[15202]: I0319 09:50:22.198455 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8nppp" Mar 19 09:50:22.198734 master-0 kubenswrapper[15202]: I0319 09:50:22.198409 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8nppp" event={"ID":"482a5739-51df-41c5-89b0-8ea2e82cee8a","Type":"ContainerDied","Data":"863a2230400dad99fe75bdc361d00d8e83a9df5751e91711191684b15cfb07a0"} Mar 19 09:50:22.198801 master-0 kubenswrapper[15202]: I0319 09:50:22.198740 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="863a2230400dad99fe75bdc361d00d8e83a9df5751e91711191684b15cfb07a0" Mar 19 09:50:22.202311 master-0 kubenswrapper[15202]: I0319 09:50:22.200053 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wjhhn" event={"ID":"310506d0-94fa-4573-a29a-5ddd012b6e64","Type":"ContainerDied","Data":"62b0104bd9a893afbfa9ca94eecace9ca77ca6f2ed4bf3c250908cd7a6955d3e"} Mar 19 09:50:22.202311 master-0 kubenswrapper[15202]: I0319 09:50:22.200093 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b0104bd9a893afbfa9ca94eecace9ca77ca6f2ed4bf3c250908cd7a6955d3e" Mar 19 09:50:22.202311 master-0 kubenswrapper[15202]: I0319 09:50:22.200090 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-wjhhn" Mar 19 09:50:22.202311 master-0 kubenswrapper[15202]: I0319 09:50:22.201530 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vk8gz" event={"ID":"9fefe00c-9546-4205-a4e5-a73e807d6bf4","Type":"ContainerStarted","Data":"1889dd316ddb14048830e434954b638b78e3749d2615c2a488f5b1ea38ff640c"} Mar 19 09:50:22.204837 master-0 kubenswrapper[15202]: I0319 09:50:22.204788 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-886c-account-create-update-24ntn" event={"ID":"3c485a6b-17be-4afe-8110-a57f77347be1","Type":"ContainerDied","Data":"17bdab900e1443663225c0d1cabbb01cc64346b3f1386ab88236cc2c2e667a5e"} Mar 19 09:50:22.204925 master-0 kubenswrapper[15202]: I0319 09:50:22.204846 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17bdab900e1443663225c0d1cabbb01cc64346b3f1386ab88236cc2c2e667a5e" Mar 19 09:50:22.204925 master-0 kubenswrapper[15202]: I0319 09:50:22.204891 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-886c-account-create-update-24ntn" Mar 19 09:50:22.242176 master-0 kubenswrapper[15202]: I0319 09:50:22.242111 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vk8gz" podStartSLOduration=2.152343746 podStartE2EDuration="7.242092395s" podCreationTimestamp="2026-03-19 09:50:15 +0000 UTC" firstStartedPulling="2026-03-19 09:50:16.36222637 +0000 UTC m=+1533.747641186" lastFinishedPulling="2026-03-19 09:50:21.451975019 +0000 UTC m=+1538.837389835" observedRunningTime="2026-03-19 09:50:22.227788402 +0000 UTC m=+1539.613203208" watchObservedRunningTime="2026-03-19 09:50:22.242092395 +0000 UTC m=+1539.627507211" Mar 19 09:50:22.694839 master-0 kubenswrapper[15202]: I0319 09:50:22.694788 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:50:22.806305 master-0 kubenswrapper[15202]: I0319 09:50:22.799550 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-vtxcj"] Mar 19 09:50:22.806305 master-0 kubenswrapper[15202]: I0319 09:50:22.799890 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" containerName="dnsmasq-dns" containerID="cri-o://ec2e6f23bffac906678d7bfaaa7e31b24920ce9996b1ce96040da8018a7a078d" gracePeriod=10 Mar 19 09:50:22.926324 master-0 kubenswrapper[15202]: I0319 09:50:22.926248 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m68fw-config-tzm6j"] Mar 19 09:50:22.954533 master-0 kubenswrapper[15202]: I0319 09:50:22.944990 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m68fw-config-tzm6j"] Mar 19 09:50:23.239674 master-0 kubenswrapper[15202]: I0319 09:50:23.239493 15202 generic.go:334] "Generic (PLEG): container finished" 
podID="c9bce886-0af6-432e-ab68-b4af30a4defd" containerID="c1b707bc875ad8212e30e4e8f547814d3aaefa608648a913c592528a62709843" exitCode=0 Mar 19 09:50:23.239877 master-0 kubenswrapper[15202]: I0319 09:50:23.239679 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zxw2c" event={"ID":"c9bce886-0af6-432e-ab68-b4af30a4defd","Type":"ContainerDied","Data":"c1b707bc875ad8212e30e4e8f547814d3aaefa608648a913c592528a62709843"} Mar 19 09:50:23.244523 master-0 kubenswrapper[15202]: I0319 09:50:23.244381 15202 generic.go:334] "Generic (PLEG): container finished" podID="6345bc32-ed02-4534-830a-229d7f9e4975" containerID="ec2e6f23bffac906678d7bfaaa7e31b24920ce9996b1ce96040da8018a7a078d" exitCode=0 Mar 19 09:50:23.244682 master-0 kubenswrapper[15202]: I0319 09:50:23.244482 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" event={"ID":"6345bc32-ed02-4534-830a-229d7f9e4975","Type":"ContainerDied","Data":"ec2e6f23bffac906678d7bfaaa7e31b24920ce9996b1ce96040da8018a7a078d"} Mar 19 09:50:23.439690 master-0 kubenswrapper[15202]: I0319 09:50:23.438594 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:50:23.474228 master-0 kubenswrapper[15202]: I0319 09:50:23.474178 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc\") pod \"6345bc32-ed02-4534-830a-229d7f9e4975\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " Mar 19 09:50:23.474228 master-0 kubenswrapper[15202]: I0319 09:50:23.474222 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-nb\") pod \"6345bc32-ed02-4534-830a-229d7f9e4975\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " Mar 19 09:50:23.474499 master-0 kubenswrapper[15202]: I0319 09:50:23.474252 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-config\") pod \"6345bc32-ed02-4534-830a-229d7f9e4975\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " Mar 19 09:50:23.474499 master-0 kubenswrapper[15202]: I0319 09:50:23.474285 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-sb\") pod \"6345bc32-ed02-4534-830a-229d7f9e4975\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " Mar 19 09:50:23.474499 master-0 kubenswrapper[15202]: I0319 09:50:23.474328 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbxkx\" (UniqueName: \"kubernetes.io/projected/6345bc32-ed02-4534-830a-229d7f9e4975-kube-api-access-mbxkx\") pod \"6345bc32-ed02-4534-830a-229d7f9e4975\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " Mar 19 09:50:23.478621 master-0 kubenswrapper[15202]: I0319 09:50:23.478528 15202 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6345bc32-ed02-4534-830a-229d7f9e4975-kube-api-access-mbxkx" (OuterVolumeSpecName: "kube-api-access-mbxkx") pod "6345bc32-ed02-4534-830a-229d7f9e4975" (UID: "6345bc32-ed02-4534-830a-229d7f9e4975"). InnerVolumeSpecName "kube-api-access-mbxkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:23.530318 master-0 kubenswrapper[15202]: I0319 09:50:23.530214 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6345bc32-ed02-4534-830a-229d7f9e4975" (UID: "6345bc32-ed02-4534-830a-229d7f9e4975"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:23.550345 master-0 kubenswrapper[15202]: I0319 09:50:23.550279 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-config" (OuterVolumeSpecName: "config") pod "6345bc32-ed02-4534-830a-229d7f9e4975" (UID: "6345bc32-ed02-4534-830a-229d7f9e4975"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:23.570702 master-0 kubenswrapper[15202]: E0319 09:50:23.570629 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc podName:6345bc32-ed02-4534-830a-229d7f9e4975 nodeName:}" failed. No retries permitted until 2026-03-19 09:50:24.070591382 +0000 UTC m=+1541.456006198 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc") pod "6345bc32-ed02-4534-830a-229d7f9e4975" (UID: "6345bc32-ed02-4534-830a-229d7f9e4975") : error deleting /var/lib/kubelet/pods/6345bc32-ed02-4534-830a-229d7f9e4975/volume-subpaths: remove /var/lib/kubelet/pods/6345bc32-ed02-4534-830a-229d7f9e4975/volume-subpaths: no such file or directory Mar 19 09:50:23.571010 master-0 kubenswrapper[15202]: I0319 09:50:23.570972 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6345bc32-ed02-4534-830a-229d7f9e4975" (UID: "6345bc32-ed02-4534-830a-229d7f9e4975"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:23.580358 master-0 kubenswrapper[15202]: I0319 09:50:23.580297 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:23.580358 master-0 kubenswrapper[15202]: I0319 09:50:23.580350 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:23.580462 master-0 kubenswrapper[15202]: I0319 09:50:23.580361 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:23.580462 master-0 kubenswrapper[15202]: I0319 09:50:23.580377 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbxkx\" (UniqueName: \"kubernetes.io/projected/6345bc32-ed02-4534-830a-229d7f9e4975-kube-api-access-mbxkx\") 
on node \"master-0\" DevicePath \"\"" Mar 19 09:50:24.089402 master-0 kubenswrapper[15202]: I0319 09:50:24.089324 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc\") pod \"6345bc32-ed02-4534-830a-229d7f9e4975\" (UID: \"6345bc32-ed02-4534-830a-229d7f9e4975\") " Mar 19 09:50:24.090005 master-0 kubenswrapper[15202]: I0319 09:50:24.089869 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6345bc32-ed02-4534-830a-229d7f9e4975" (UID: "6345bc32-ed02-4534-830a-229d7f9e4975"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:24.090288 master-0 kubenswrapper[15202]: I0319 09:50:24.090255 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6345bc32-ed02-4534-830a-229d7f9e4975-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:24.257312 master-0 kubenswrapper[15202]: I0319 09:50:24.257233 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" event={"ID":"6345bc32-ed02-4534-830a-229d7f9e4975","Type":"ContainerDied","Data":"40461aab4a632ecc52ef5b4fb81785895cc871a798e5c330b78b6be17068c31a"} Mar 19 09:50:24.257312 master-0 kubenswrapper[15202]: I0319 09:50:24.257315 15202 scope.go:117] "RemoveContainer" containerID="ec2e6f23bffac906678d7bfaaa7e31b24920ce9996b1ce96040da8018a7a078d" Mar 19 09:50:24.257610 master-0 kubenswrapper[15202]: I0319 09:50:24.257260 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-vtxcj" Mar 19 09:50:24.280933 master-0 kubenswrapper[15202]: I0319 09:50:24.280874 15202 scope.go:117] "RemoveContainer" containerID="f879d65e6b05b29ec0ecdb7a3aef03b0c5ba8763137a1b1c2ef6ee3fa4086b25" Mar 19 09:50:24.324603 master-0 kubenswrapper[15202]: I0319 09:50:24.320397 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-vtxcj"] Mar 19 09:50:24.338768 master-0 kubenswrapper[15202]: I0319 09:50:24.338634 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-vtxcj"] Mar 19 09:50:24.790435 master-0 kubenswrapper[15202]: I0319 09:50:24.790257 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zxw2c" Mar 19 09:50:24.807601 master-0 kubenswrapper[15202]: I0319 09:50:24.807507 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-config-data\") pod \"c9bce886-0af6-432e-ab68-b4af30a4defd\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " Mar 19 09:50:24.807850 master-0 kubenswrapper[15202]: I0319 09:50:24.807743 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-combined-ca-bundle\") pod \"c9bce886-0af6-432e-ab68-b4af30a4defd\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " Mar 19 09:50:24.807901 master-0 kubenswrapper[15202]: I0319 09:50:24.807846 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pbvc\" (UniqueName: \"kubernetes.io/projected/c9bce886-0af6-432e-ab68-b4af30a4defd-kube-api-access-9pbvc\") pod \"c9bce886-0af6-432e-ab68-b4af30a4defd\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " Mar 19 09:50:24.808040 master-0 kubenswrapper[15202]: I0319 
09:50:24.807968 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-db-sync-config-data\") pod \"c9bce886-0af6-432e-ab68-b4af30a4defd\" (UID: \"c9bce886-0af6-432e-ab68-b4af30a4defd\") " Mar 19 09:50:24.812775 master-0 kubenswrapper[15202]: I0319 09:50:24.812709 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c9bce886-0af6-432e-ab68-b4af30a4defd" (UID: "c9bce886-0af6-432e-ab68-b4af30a4defd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24.829643 master-0 kubenswrapper[15202]: I0319 09:50:24.828203 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" path="/var/lib/kubelet/pods/6345bc32-ed02-4534-830a-229d7f9e4975/volumes" Mar 19 09:50:24.829643 master-0 kubenswrapper[15202]: I0319 09:50:24.829064 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" path="/var/lib/kubelet/pods/b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f/volumes" Mar 19 09:50:24.836655 master-0 kubenswrapper[15202]: I0319 09:50:24.836543 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bce886-0af6-432e-ab68-b4af30a4defd-kube-api-access-9pbvc" (OuterVolumeSpecName: "kube-api-access-9pbvc") pod "c9bce886-0af6-432e-ab68-b4af30a4defd" (UID: "c9bce886-0af6-432e-ab68-b4af30a4defd"). InnerVolumeSpecName "kube-api-access-9pbvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:24.874506 master-0 kubenswrapper[15202]: I0319 09:50:24.870652 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9bce886-0af6-432e-ab68-b4af30a4defd" (UID: "c9bce886-0af6-432e-ab68-b4af30a4defd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24.925119 master-0 kubenswrapper[15202]: I0319 09:50:24.917948 15202 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:24.925119 master-0 kubenswrapper[15202]: I0319 09:50:24.918000 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:24.925119 master-0 kubenswrapper[15202]: I0319 09:50:24.918014 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pbvc\" (UniqueName: \"kubernetes.io/projected/c9bce886-0af6-432e-ab68-b4af30a4defd-kube-api-access-9pbvc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:24.931126 master-0 kubenswrapper[15202]: I0319 09:50:24.930126 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-config-data" (OuterVolumeSpecName: "config-data") pod "c9bce886-0af6-432e-ab68-b4af30a4defd" (UID: "c9bce886-0af6-432e-ab68-b4af30a4defd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963145 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86659cf465-r6c25"] Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963668 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" containerName="ovn-config" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963682 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" containerName="ovn-config" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963702 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9bce886-0af6-432e-ab68-b4af30a4defd" containerName="glance-db-sync" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963709 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bce886-0af6-432e-ab68-b4af30a4defd" containerName="glance-db-sync" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963732 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482a5739-51df-41c5-89b0-8ea2e82cee8a" containerName="mariadb-database-create" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963738 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="482a5739-51df-41c5-89b0-8ea2e82cee8a" containerName="mariadb-database-create" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963755 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592e718b-4b77-4ef3-8ee0-e7ce98415e3c" containerName="mariadb-account-create-update" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963762 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="592e718b-4b77-4ef3-8ee0-e7ce98415e3c" containerName="mariadb-account-create-update" Mar 19 09:50:24.970562 master-0 
kubenswrapper[15202]: E0319 09:50:24.963785 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" containerName="dnsmasq-dns" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963791 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" containerName="dnsmasq-dns" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963803 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310506d0-94fa-4573-a29a-5ddd012b6e64" containerName="mariadb-database-create" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963813 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="310506d0-94fa-4573-a29a-5ddd012b6e64" containerName="mariadb-database-create" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963834 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" containerName="init" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963842 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" containerName="init" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: E0319 09:50:24.963859 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c485a6b-17be-4afe-8110-a57f77347be1" containerName="mariadb-account-create-update" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.963866 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c485a6b-17be-4afe-8110-a57f77347be1" containerName="mariadb-account-create-update" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964131 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="482a5739-51df-41c5-89b0-8ea2e82cee8a" containerName="mariadb-database-create" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964150 15202 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c9bce886-0af6-432e-ab68-b4af30a4defd" containerName="glance-db-sync" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964160 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c9f92b-f7c6-41b1-8260-f7f33c16fc1f" containerName="ovn-config" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964172 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="6345bc32-ed02-4534-830a-229d7f9e4975" containerName="dnsmasq-dns" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964190 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c485a6b-17be-4afe-8110-a57f77347be1" containerName="mariadb-account-create-update" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964209 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="592e718b-4b77-4ef3-8ee0-e7ce98415e3c" containerName="mariadb-account-create-update" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.964219 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="310506d0-94fa-4573-a29a-5ddd012b6e64" containerName="mariadb-database-create" Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.969416 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86659cf465-r6c25"] Mar 19 09:50:24.970562 master-0 kubenswrapper[15202]: I0319 09:50:24.969610 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:24.982737 master-0 kubenswrapper[15202]: I0319 09:50:24.976737 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-a" Mar 19 09:50:25.019809 master-0 kubenswrapper[15202]: I0319 09:50:25.019750 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-edpm-a\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.020387 master-0 kubenswrapper[15202]: I0319 09:50:25.020366 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-config\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.020586 master-0 kubenswrapper[15202]: I0319 09:50:25.020568 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9k2h\" (UniqueName: \"kubernetes.io/projected/b3def07a-70e3-4a58-b0ce-60a0b208548f-kube-api-access-j9k2h\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.020735 master-0 kubenswrapper[15202]: I0319 09:50:25.020720 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-nb\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.020920 master-0 kubenswrapper[15202]: I0319 09:50:25.020905 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-sb\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.021034 master-0 kubenswrapper[15202]: I0319 09:50:25.021020 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-svc\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.021129 master-0 kubenswrapper[15202]: I0319 09:50:25.021116 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-swift-storage-0\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.021257 master-0 kubenswrapper[15202]: I0319 09:50:25.021242 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bce886-0af6-432e-ab68-b4af30a4defd-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:25.122942 master-0 kubenswrapper[15202]: I0319 09:50:25.122857 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-svc\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.122942 master-0 kubenswrapper[15202]: I0319 09:50:25.122937 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-swift-storage-0\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.123625 master-0 kubenswrapper[15202]: I0319 09:50:25.122968 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-edpm-a\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.123625 master-0 kubenswrapper[15202]: I0319 09:50:25.123033 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-config\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.123625 master-0 kubenswrapper[15202]: I0319 09:50:25.123072 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9k2h\" (UniqueName: \"kubernetes.io/projected/b3def07a-70e3-4a58-b0ce-60a0b208548f-kube-api-access-j9k2h\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.123625 master-0 kubenswrapper[15202]: I0319 09:50:25.123107 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-nb\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.123625 master-0 kubenswrapper[15202]: I0319 09:50:25.123144 15202 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-sb\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.125839 master-0 kubenswrapper[15202]: I0319 09:50:25.124018 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-sb\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.125839 master-0 kubenswrapper[15202]: I0319 09:50:25.124615 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-edpm-a\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.125839 master-0 kubenswrapper[15202]: I0319 09:50:25.124626 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-config\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.125839 master-0 kubenswrapper[15202]: I0319 09:50:25.125276 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-svc\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.125839 master-0 kubenswrapper[15202]: I0319 09:50:25.125519 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-swift-storage-0\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.126315 master-0 kubenswrapper[15202]: I0319 09:50:25.126281 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-nb\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.153998 master-0 kubenswrapper[15202]: I0319 09:50:25.153929 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9k2h\" (UniqueName: \"kubernetes.io/projected/b3def07a-70e3-4a58-b0ce-60a0b208548f-kube-api-access-j9k2h\") pod \"dnsmasq-dns-86659cf465-r6c25\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.271502 master-0 kubenswrapper[15202]: I0319 09:50:25.271376 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zxw2c" event={"ID":"c9bce886-0af6-432e-ab68-b4af30a4defd","Type":"ContainerDied","Data":"d94ffaca0ede5ff4e2b1d05f0bc3779f014ad819344d687838eb6f2a70e75027"} Mar 19 09:50:25.271772 master-0 kubenswrapper[15202]: I0319 09:50:25.271540 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zxw2c" Mar 19 09:50:25.271867 master-0 kubenswrapper[15202]: I0319 09:50:25.271717 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d94ffaca0ede5ff4e2b1d05f0bc3779f014ad819344d687838eb6f2a70e75027" Mar 19 09:50:25.291434 master-0 kubenswrapper[15202]: I0319 09:50:25.291373 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:25.745269 master-0 kubenswrapper[15202]: I0319 09:50:25.745214 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86659cf465-r6c25"] Mar 19 09:50:25.902750 master-0 kubenswrapper[15202]: I0319 09:50:25.896020 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59697cf549-dzw8p"] Mar 19 09:50:25.902750 master-0 kubenswrapper[15202]: I0319 09:50:25.899389 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.961776 master-0 kubenswrapper[15202]: I0319 09:50:25.961278 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86659cf465-r6c25"] Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980221 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-config\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980298 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-nb\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980370 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-sb\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: 
\"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980444 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgm7\" (UniqueName: \"kubernetes.io/projected/16865636-fcf3-49a3-bd27-dde9ebbd3549-kube-api-access-fxgm7\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980645 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-svc\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980682 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-swift-storage-0\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.981491 master-0 kubenswrapper[15202]: I0319 09:50:25.980704 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-edpm-a\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:25.985435 master-0 kubenswrapper[15202]: I0319 09:50:25.985388 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59697cf549-dzw8p"] Mar 19 09:50:26.085541 master-0 
kubenswrapper[15202]: I0319 09:50:26.084153 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-config\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.085541 master-0 kubenswrapper[15202]: I0319 09:50:26.084197 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-nb\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.085541 master-0 kubenswrapper[15202]: I0319 09:50:26.084234 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-sb\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.085541 master-0 kubenswrapper[15202]: I0319 09:50:26.084264 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgm7\" (UniqueName: \"kubernetes.io/projected/16865636-fcf3-49a3-bd27-dde9ebbd3549-kube-api-access-fxgm7\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.085903 master-0 kubenswrapper[15202]: I0319 09:50:26.085826 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-svc\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.085903 master-0 
kubenswrapper[15202]: I0319 09:50:26.085856 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-swift-storage-0\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.085903 master-0 kubenswrapper[15202]: I0319 09:50:26.085871 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-edpm-a\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.087299 master-0 kubenswrapper[15202]: I0319 09:50:26.086878 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-config\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.087299 master-0 kubenswrapper[15202]: I0319 09:50:26.087176 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-sb\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.087299 master-0 kubenswrapper[15202]: I0319 09:50:26.087191 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-nb\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.087922 master-0 kubenswrapper[15202]: I0319 
09:50:26.087546 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-svc\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.089063 master-0 kubenswrapper[15202]: I0319 09:50:26.089037 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-swift-storage-0\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.089942 master-0 kubenswrapper[15202]: I0319 09:50:26.089794 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-edpm-a\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.175596 master-0 kubenswrapper[15202]: I0319 09:50:26.172949 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgm7\" (UniqueName: \"kubernetes.io/projected/16865636-fcf3-49a3-bd27-dde9ebbd3549-kube-api-access-fxgm7\") pod \"dnsmasq-dns-59697cf549-dzw8p\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.191029 master-0 kubenswrapper[15202]: I0319 09:50:26.190677 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59697cf549-dzw8p"] Mar 19 09:50:26.199670 master-0 kubenswrapper[15202]: I0319 09:50:26.192603 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:26.267791 master-0 kubenswrapper[15202]: I0319 09:50:26.267104 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f88f897-5c5kd"] Mar 19 09:50:26.269358 master-0 kubenswrapper[15202]: I0319 09:50:26.269316 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.292796 master-0 kubenswrapper[15202]: I0319 09:50:26.286090 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f88f897-5c5kd"] Mar 19 09:50:26.306269 master-0 kubenswrapper[15202]: I0319 09:50:26.306055 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-swift-storage-0\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.306907 master-0 kubenswrapper[15202]: I0319 09:50:26.306863 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/98cda58d-8e34-4bcf-8169-179fa5f470cc-kube-api-access-mppjl\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.306985 master-0 kubenswrapper[15202]: I0319 09:50:26.306951 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-config\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.307074 master-0 kubenswrapper[15202]: I0319 09:50:26.307049 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-sb\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.307134 master-0 kubenswrapper[15202]: I0319 09:50:26.307082 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-nb\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.307282 master-0 kubenswrapper[15202]: I0319 09:50:26.307260 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-edpm-a\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.307404 master-0 kubenswrapper[15202]: I0319 09:50:26.307381 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-svc\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.325149 master-0 kubenswrapper[15202]: I0319 09:50:26.325033 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86659cf465-r6c25" event={"ID":"b3def07a-70e3-4a58-b0ce-60a0b208548f","Type":"ContainerStarted","Data":"01edb1348d344e8f3c2bb5314f677084da5a8b165c9b8b4864eea730440b6b3a"} Mar 19 09:50:26.413888 master-0 kubenswrapper[15202]: I0319 09:50:26.413813 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/98cda58d-8e34-4bcf-8169-179fa5f470cc-kube-api-access-mppjl\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.414037 master-0 kubenswrapper[15202]: I0319 09:50:26.413906 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-config\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.414037 master-0 kubenswrapper[15202]: I0319 09:50:26.413955 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-sb\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.415070 master-0 kubenswrapper[15202]: I0319 09:50:26.415031 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-config\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.416601 master-0 kubenswrapper[15202]: I0319 09:50:26.415838 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-sb\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.430666 master-0 kubenswrapper[15202]: I0319 09:50:26.430217 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-nb\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.430666 master-0 kubenswrapper[15202]: I0319 09:50:26.430434 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-edpm-a\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.430780 master-0 kubenswrapper[15202]: I0319 09:50:26.430698 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-svc\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.432258 master-0 kubenswrapper[15202]: I0319 09:50:26.431829 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-edpm-a\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.436103 master-0 kubenswrapper[15202]: I0319 09:50:26.435042 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-swift-storage-0\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.436891 master-0 kubenswrapper[15202]: I0319 09:50:26.436523 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-svc\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.437069 master-0 kubenswrapper[15202]: I0319 09:50:26.437034 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-nb\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.439045 master-0 kubenswrapper[15202]: I0319 09:50:26.437837 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-swift-storage-0\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.445096 master-0 kubenswrapper[15202]: I0319 09:50:26.445004 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/98cda58d-8e34-4bcf-8169-179fa5f470cc-kube-api-access-mppjl\") pod \"dnsmasq-dns-85f88f897-5c5kd\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") " pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.619683 master-0 kubenswrapper[15202]: I0319 09:50:26.619246 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5"] Mar 19 09:50:26.625347 master-0 kubenswrapper[15202]: I0319 09:50:26.624612 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.628770 master-0 kubenswrapper[15202]: I0319 09:50:26.628740 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-a-provisionserver-httpd-config" Mar 19 09:50:26.629539 master-0 kubenswrapper[15202]: I0319 09:50:26.628931 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" Mar 19 09:50:26.749893 master-0 kubenswrapper[15202]: I0319 09:50:26.749819 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/9bf992bb-2aac-49c3-8135-6ab9f3a53193-image-data\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.754830 master-0 kubenswrapper[15202]: I0319 09:50:26.754770 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdzzf\" (UniqueName: \"kubernetes.io/projected/9bf992bb-2aac-49c3-8135-6ab9f3a53193-kube-api-access-zdzzf\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.755006 master-0 kubenswrapper[15202]: I0319 09:50:26.754978 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/9bf992bb-2aac-49c3-8135-6ab9f3a53193-httpd-config\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.786947 
master-0 kubenswrapper[15202]: I0319 09:50:26.786735 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59697cf549-dzw8p"] Mar 19 09:50:26.863231 master-0 kubenswrapper[15202]: I0319 09:50:26.861991 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/9bf992bb-2aac-49c3-8135-6ab9f3a53193-image-data\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.863231 master-0 kubenswrapper[15202]: I0319 09:50:26.862137 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzzf\" (UniqueName: \"kubernetes.io/projected/9bf992bb-2aac-49c3-8135-6ab9f3a53193-kube-api-access-zdzzf\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.863231 master-0 kubenswrapper[15202]: I0319 09:50:26.862188 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/9bf992bb-2aac-49c3-8135-6ab9f3a53193-httpd-config\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.863231 master-0 kubenswrapper[15202]: I0319 09:50:26.862685 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/9bf992bb-2aac-49c3-8135-6ab9f3a53193-image-data\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " 
pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.864499 master-0 kubenswrapper[15202]: I0319 09:50:26.864456 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/9bf992bb-2aac-49c3-8135-6ab9f3a53193-httpd-config\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.888492 master-0 kubenswrapper[15202]: I0319 09:50:26.888359 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzzf\" (UniqueName: \"kubernetes.io/projected/9bf992bb-2aac-49c3-8135-6ab9f3a53193-kube-api-access-zdzzf\") pod \"edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5\" (UID: \"9bf992bb-2aac-49c3-8135-6ab9f3a53193\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:26.978558 master-0 kubenswrapper[15202]: I0319 09:50:26.978505 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:50:27.188793 master-0 kubenswrapper[15202]: I0319 09:50:27.188737 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f88f897-5c5kd"] Mar 19 09:50:27.198183 master-0 kubenswrapper[15202]: W0319 09:50:27.197936 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98cda58d_8e34_4bcf_8169_179fa5f470cc.slice/crio-80d3bdead8473a22af41f622af4b4132416e3514989479a0f66199372aee5b35 WatchSource:0}: Error finding container 80d3bdead8473a22af41f622af4b4132416e3514989479a0f66199372aee5b35: Status 404 returned error can't find the container with id 80d3bdead8473a22af41f622af4b4132416e3514989479a0f66199372aee5b35 Mar 19 09:50:27.340738 master-0 kubenswrapper[15202]: I0319 09:50:27.340682 15202 generic.go:334] "Generic (PLEG): container finished" podID="9fefe00c-9546-4205-a4e5-a73e807d6bf4" containerID="1889dd316ddb14048830e434954b638b78e3749d2615c2a488f5b1ea38ff640c" exitCode=0 Mar 19 09:50:27.340875 master-0 kubenswrapper[15202]: I0319 09:50:27.340779 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vk8gz" event={"ID":"9fefe00c-9546-4205-a4e5-a73e807d6bf4","Type":"ContainerDied","Data":"1889dd316ddb14048830e434954b638b78e3749d2615c2a488f5b1ea38ff640c"} Mar 19 09:50:27.343159 master-0 kubenswrapper[15202]: I0319 09:50:27.343038 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" event={"ID":"98cda58d-8e34-4bcf-8169-179fa5f470cc","Type":"ContainerStarted","Data":"80d3bdead8473a22af41f622af4b4132416e3514989479a0f66199372aee5b35"} Mar 19 09:50:27.345759 master-0 kubenswrapper[15202]: I0319 09:50:27.345721 15202 generic.go:334] "Generic (PLEG): container finished" podID="b3def07a-70e3-4a58-b0ce-60a0b208548f" 
containerID="446e11c09bc8dbc51e652ef97230f39d544148136a0ceeec266e128247b5f5db" exitCode=0 Mar 19 09:50:27.345830 master-0 kubenswrapper[15202]: I0319 09:50:27.345787 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86659cf465-r6c25" event={"ID":"b3def07a-70e3-4a58-b0ce-60a0b208548f","Type":"ContainerDied","Data":"446e11c09bc8dbc51e652ef97230f39d544148136a0ceeec266e128247b5f5db"} Mar 19 09:50:27.352510 master-0 kubenswrapper[15202]: I0319 09:50:27.352219 15202 generic.go:334] "Generic (PLEG): container finished" podID="16865636-fcf3-49a3-bd27-dde9ebbd3549" containerID="d0a480e1553e1d5f62125757f8487f793e20dfb8099d9560721ae79d4f6927f9" exitCode=0 Mar 19 09:50:27.352510 master-0 kubenswrapper[15202]: I0319 09:50:27.352294 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" event={"ID":"16865636-fcf3-49a3-bd27-dde9ebbd3549","Type":"ContainerDied","Data":"d0a480e1553e1d5f62125757f8487f793e20dfb8099d9560721ae79d4f6927f9"} Mar 19 09:50:27.352510 master-0 kubenswrapper[15202]: I0319 09:50:27.352316 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" event={"ID":"16865636-fcf3-49a3-bd27-dde9ebbd3549","Type":"ContainerStarted","Data":"e3119a5aacfb3a67588e7519734fe5fc70eb2df2f218f798711ef71ed8a7264a"} Mar 19 09:50:27.360646 master-0 kubenswrapper[15202]: I0319 09:50:27.360152 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" event={"ID":"9bf992bb-2aac-49c3-8135-6ab9f3a53193","Type":"ContainerStarted","Data":"6144c3e349ea95e1b6268d88ff183de1e4e2aaf45df93ebc0e2d842313a0ab17"} Mar 19 09:50:28.045438 master-0 kubenswrapper[15202]: I0319 09:50:28.045383 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:28.053865 master-0 kubenswrapper[15202]: I0319 09:50:28.053831 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:28.202196 master-0 kubenswrapper[15202]: I0319 09:50:28.202140 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-svc\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " Mar 19 09:50:28.203020 master-0 kubenswrapper[15202]: I0319 09:50:28.203001 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-edpm-a\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.203205 master-0 kubenswrapper[15202]: I0319 09:50:28.203190 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-swift-storage-0\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.203331 master-0 kubenswrapper[15202]: I0319 09:50:28.203317 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-swift-storage-0\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " Mar 19 09:50:28.203776 master-0 kubenswrapper[15202]: I0319 09:50:28.203756 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-nb\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " Mar 19 09:50:28.203909 master-0 kubenswrapper[15202]: I0319 09:50:28.203895 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9k2h\" (UniqueName: \"kubernetes.io/projected/b3def07a-70e3-4a58-b0ce-60a0b208548f-kube-api-access-j9k2h\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " Mar 19 09:50:28.204079 master-0 kubenswrapper[15202]: I0319 09:50:28.204055 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-edpm-a\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " Mar 19 09:50:28.204590 master-0 kubenswrapper[15202]: I0319 09:50:28.204574 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-config\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.204750 master-0 kubenswrapper[15202]: I0319 09:50:28.204737 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-sb\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.204868 master-0 kubenswrapper[15202]: I0319 09:50:28.204856 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-sb\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " 
Mar 19 09:50:28.204947 master-0 kubenswrapper[15202]: I0319 09:50:28.204935 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-svc\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.205052 master-0 kubenswrapper[15202]: I0319 09:50:28.205032 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-config\") pod \"b3def07a-70e3-4a58-b0ce-60a0b208548f\" (UID: \"b3def07a-70e3-4a58-b0ce-60a0b208548f\") " Mar 19 09:50:28.205147 master-0 kubenswrapper[15202]: I0319 09:50:28.205133 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-nb\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.205264 master-0 kubenswrapper[15202]: I0319 09:50:28.205250 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxgm7\" (UniqueName: \"kubernetes.io/projected/16865636-fcf3-49a3-bd27-dde9ebbd3549-kube-api-access-fxgm7\") pod \"16865636-fcf3-49a3-bd27-dde9ebbd3549\" (UID: \"16865636-fcf3-49a3-bd27-dde9ebbd3549\") " Mar 19 09:50:28.208279 master-0 kubenswrapper[15202]: I0319 09:50:28.208207 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3def07a-70e3-4a58-b0ce-60a0b208548f-kube-api-access-j9k2h" (OuterVolumeSpecName: "kube-api-access-j9k2h") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "kube-api-access-j9k2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:28.222744 master-0 kubenswrapper[15202]: I0319 09:50:28.222656 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16865636-fcf3-49a3-bd27-dde9ebbd3549-kube-api-access-fxgm7" (OuterVolumeSpecName: "kube-api-access-fxgm7") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "kube-api-access-fxgm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:28.236174 master-0 kubenswrapper[15202]: I0319 09:50:28.236108 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.249660 master-0 kubenswrapper[15202]: I0319 09:50:28.249399 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-config" (OuterVolumeSpecName: "config") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.253941 master-0 kubenswrapper[15202]: I0319 09:50:28.253857 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.258447 master-0 kubenswrapper[15202]: I0319 09:50:28.258302 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.258447 master-0 kubenswrapper[15202]: I0319 09:50:28.258379 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.262780 master-0 kubenswrapper[15202]: I0319 09:50:28.262714 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-config" (OuterVolumeSpecName: "config") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.265585 master-0 kubenswrapper[15202]: I0319 09:50:28.265537 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.268551 master-0 kubenswrapper[15202]: I0319 09:50:28.268485 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.278155 master-0 kubenswrapper[15202]: I0319 09:50:28.277932 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b3def07a-70e3-4a58-b0ce-60a0b208548f" (UID: "b3def07a-70e3-4a58-b0ce-60a0b208548f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.285990 master-0 kubenswrapper[15202]: I0319 09:50:28.285923 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.296655 master-0 kubenswrapper[15202]: I0319 09:50:28.296597 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.308145 master-0 kubenswrapper[15202]: I0319 09:50:28.308062 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16865636-fcf3-49a3-bd27-dde9ebbd3549" (UID: "16865636-fcf3-49a3-bd27-dde9ebbd3549"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308751 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308786 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308799 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308810 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308819 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308830 15202 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308839 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.308836 master-0 kubenswrapper[15202]: I0319 09:50:28.308848 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxgm7\" (UniqueName: \"kubernetes.io/projected/16865636-fcf3-49a3-bd27-dde9ebbd3549-kube-api-access-fxgm7\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.309599 master-0 kubenswrapper[15202]: I0319 09:50:28.308859 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.309599 master-0 kubenswrapper[15202]: I0319 09:50:28.308868 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.309599 master-0 kubenswrapper[15202]: I0319 09:50:28.308877 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16865636-fcf3-49a3-bd27-dde9ebbd3549-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.309599 master-0 kubenswrapper[15202]: I0319 09:50:28.308885 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.309599 master-0 kubenswrapper[15202]: I0319 09:50:28.308894 15202 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3def07a-70e3-4a58-b0ce-60a0b208548f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.309599 master-0 kubenswrapper[15202]: I0319 09:50:28.308902 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9k2h\" (UniqueName: \"kubernetes.io/projected/b3def07a-70e3-4a58-b0ce-60a0b208548f-kube-api-access-j9k2h\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:28.382218 master-0 kubenswrapper[15202]: I0319 09:50:28.380023 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" event={"ID":"16865636-fcf3-49a3-bd27-dde9ebbd3549","Type":"ContainerDied","Data":"e3119a5aacfb3a67588e7519734fe5fc70eb2df2f218f798711ef71ed8a7264a"} Mar 19 09:50:28.382218 master-0 kubenswrapper[15202]: I0319 09:50:28.380101 15202 scope.go:117] "RemoveContainer" containerID="d0a480e1553e1d5f62125757f8487f793e20dfb8099d9560721ae79d4f6927f9" Mar 19 09:50:28.382218 master-0 kubenswrapper[15202]: I0319 09:50:28.380247 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59697cf549-dzw8p" Mar 19 09:50:28.385099 master-0 kubenswrapper[15202]: I0319 09:50:28.385042 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" event={"ID":"98cda58d-8e34-4bcf-8169-179fa5f470cc","Type":"ContainerDied","Data":"440dcbc4eba725ffdff558b14407e715b7993442f78fd2a3de9f4dfa2a1c7e62"} Mar 19 09:50:28.386276 master-0 kubenswrapper[15202]: I0319 09:50:28.384779 15202 generic.go:334] "Generic (PLEG): container finished" podID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerID="440dcbc4eba725ffdff558b14407e715b7993442f78fd2a3de9f4dfa2a1c7e62" exitCode=0 Mar 19 09:50:28.395902 master-0 kubenswrapper[15202]: I0319 09:50:28.391924 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86659cf465-r6c25" Mar 19 09:50:28.395902 master-0 kubenswrapper[15202]: I0319 09:50:28.394007 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86659cf465-r6c25" event={"ID":"b3def07a-70e3-4a58-b0ce-60a0b208548f","Type":"ContainerDied","Data":"01edb1348d344e8f3c2bb5314f677084da5a8b165c9b8b4864eea730440b6b3a"} Mar 19 09:50:28.419962 master-0 kubenswrapper[15202]: I0319 09:50:28.416370 15202 scope.go:117] "RemoveContainer" containerID="446e11c09bc8dbc51e652ef97230f39d544148136a0ceeec266e128247b5f5db" Mar 19 09:50:28.547837 master-0 kubenswrapper[15202]: I0319 09:50:28.544957 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59697cf549-dzw8p"] Mar 19 09:50:28.574405 master-0 kubenswrapper[15202]: I0319 09:50:28.574336 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59697cf549-dzw8p"] Mar 19 09:50:28.847146 master-0 kubenswrapper[15202]: I0319 09:50:28.843340 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16865636-fcf3-49a3-bd27-dde9ebbd3549" path="/var/lib/kubelet/pods/16865636-fcf3-49a3-bd27-dde9ebbd3549/volumes" Mar 19 09:50:28.847146 master-0 kubenswrapper[15202]: I0319 09:50:28.844105 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86659cf465-r6c25"] Mar 19 09:50:28.847146 master-0 kubenswrapper[15202]: I0319 09:50:28.844140 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86659cf465-r6c25"] Mar 19 09:50:28.971739 master-0 kubenswrapper[15202]: I0319 09:50:28.971660 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vk8gz" Mar 19 09:50:29.031731 master-0 kubenswrapper[15202]: I0319 09:50:29.031658 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmrcq\" (UniqueName: \"kubernetes.io/projected/9fefe00c-9546-4205-a4e5-a73e807d6bf4-kube-api-access-pmrcq\") pod \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " Mar 19 09:50:29.032004 master-0 kubenswrapper[15202]: I0319 09:50:29.031793 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle\") pod \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " Mar 19 09:50:29.032004 master-0 kubenswrapper[15202]: I0319 09:50:29.031873 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-config-data\") pod \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " Mar 19 09:50:29.046903 master-0 kubenswrapper[15202]: I0319 09:50:29.046769 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fefe00c-9546-4205-a4e5-a73e807d6bf4-kube-api-access-pmrcq" (OuterVolumeSpecName: "kube-api-access-pmrcq") pod "9fefe00c-9546-4205-a4e5-a73e807d6bf4" (UID: "9fefe00c-9546-4205-a4e5-a73e807d6bf4"). InnerVolumeSpecName "kube-api-access-pmrcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:29.105048 master-0 kubenswrapper[15202]: I0319 09:50:29.104340 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-config-data" (OuterVolumeSpecName: "config-data") pod "9fefe00c-9546-4205-a4e5-a73e807d6bf4" (UID: "9fefe00c-9546-4205-a4e5-a73e807d6bf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:29.134830 master-0 kubenswrapper[15202]: I0319 09:50:29.134756 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fefe00c-9546-4205-a4e5-a73e807d6bf4" (UID: "9fefe00c-9546-4205-a4e5-a73e807d6bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:29.135725 master-0 kubenswrapper[15202]: I0319 09:50:29.135657 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle\") pod \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\" (UID: \"9fefe00c-9546-4205-a4e5-a73e807d6bf4\") " Mar 19 09:50:29.136116 master-0 kubenswrapper[15202]: W0319 09:50:29.136081 15202 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9fefe00c-9546-4205-a4e5-a73e807d6bf4/volumes/kubernetes.io~secret/combined-ca-bundle Mar 19 09:50:29.136116 master-0 kubenswrapper[15202]: I0319 09:50:29.136106 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fefe00c-9546-4205-a4e5-a73e807d6bf4" (UID: "9fefe00c-9546-4205-a4e5-a73e807d6bf4"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:29.139285 master-0 kubenswrapper[15202]: I0319 09:50:29.139175 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmrcq\" (UniqueName: \"kubernetes.io/projected/9fefe00c-9546-4205-a4e5-a73e807d6bf4-kube-api-access-pmrcq\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:29.139374 master-0 kubenswrapper[15202]: I0319 09:50:29.139288 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:29.139374 master-0 kubenswrapper[15202]: I0319 09:50:29.139303 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fefe00c-9546-4205-a4e5-a73e807d6bf4-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:29.425427 master-0 kubenswrapper[15202]: I0319 09:50:29.425022 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" event={"ID":"98cda58d-8e34-4bcf-8169-179fa5f470cc","Type":"ContainerStarted","Data":"bd0d6b3de802b5b7439b97d4b6b3d20c34f3058f37fec3f1fb6a34e64fa58307"}
Mar 19 09:50:29.425427 master-0 kubenswrapper[15202]: I0319 09:50:29.425176 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f88f897-5c5kd"
Mar 19 09:50:29.438201 master-0 kubenswrapper[15202]: I0319 09:50:29.438161 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vk8gz" event={"ID":"9fefe00c-9546-4205-a4e5-a73e807d6bf4","Type":"ContainerDied","Data":"49204e09f1e0033d5afb6395c670c144fedc912d60356abda5e5ea20969b8f15"}
Mar 19 09:50:29.438201 master-0 kubenswrapper[15202]: I0319 09:50:29.438206 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49204e09f1e0033d5afb6395c670c144fedc912d60356abda5e5ea20969b8f15"
Mar 19 09:50:29.438379 master-0 kubenswrapper[15202]: I0319 09:50:29.438255 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vk8gz"
Mar 19 09:50:29.452579 master-0 kubenswrapper[15202]: I0319 09:50:29.452398 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" podStartSLOduration=3.452375403 podStartE2EDuration="3.452375403s" podCreationTimestamp="2026-03-19 09:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:29.448023605 +0000 UTC m=+1546.833438431" watchObservedRunningTime="2026-03-19 09:50:29.452375403 +0000 UTC m=+1546.837790219"
Mar 19 09:50:29.725171 master-0 kubenswrapper[15202]: I0319 09:50:29.724997 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85f88f897-5c5kd"]
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.762721 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d8f46bbdf-cnrwt"]
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: E0319 09:50:29.763249 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fefe00c-9546-4205-a4e5-a73e807d6bf4" containerName="keystone-db-sync"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.763264 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fefe00c-9546-4205-a4e5-a73e807d6bf4" containerName="keystone-db-sync"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: E0319 09:50:29.763286 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3def07a-70e3-4a58-b0ce-60a0b208548f" containerName="init"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.763293 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3def07a-70e3-4a58-b0ce-60a0b208548f" containerName="init"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: E0319 09:50:29.763326 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16865636-fcf3-49a3-bd27-dde9ebbd3549" containerName="init"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.763332 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="16865636-fcf3-49a3-bd27-dde9ebbd3549" containerName="init"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.763697 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fefe00c-9546-4205-a4e5-a73e807d6bf4" containerName="keystone-db-sync"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.763720 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3def07a-70e3-4a58-b0ce-60a0b208548f" containerName="init"
Mar 19 09:50:29.764588 master-0 kubenswrapper[15202]: I0319 09:50:29.763737 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="16865636-fcf3-49a3-bd27-dde9ebbd3549" containerName="init"
Mar 19 09:50:29.765849 master-0 kubenswrapper[15202]: I0319 09:50:29.765811 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.785732 master-0 kubenswrapper[15202]: I0319 09:50:29.784736 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8f46bbdf-cnrwt"]
Mar 19 09:50:29.810676 master-0 kubenswrapper[15202]: I0319 09:50:29.808565 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hlqwd"]
Mar 19 09:50:29.810676 master-0 kubenswrapper[15202]: I0319 09:50:29.810035 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.813623 master-0 kubenswrapper[15202]: I0319 09:50:29.813588 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 19 09:50:29.813810 master-0 kubenswrapper[15202]: I0319 09:50:29.813756 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 09:50:29.814543 master-0 kubenswrapper[15202]: I0319 09:50:29.814522 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 09:50:29.814764 master-0 kubenswrapper[15202]: I0319 09:50:29.814537 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 09:50:29.847532 master-0 kubenswrapper[15202]: I0319 09:50:29.847451 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hlqwd"]
Mar 19 09:50:29.973156 master-0 kubenswrapper[15202]: I0319 09:50:29.973024 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pft9k\" (UniqueName: \"kubernetes.io/projected/3d687cdb-7807-4089-8c5b-a840cfa6531e-kube-api-access-pft9k\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.973156 master-0 kubenswrapper[15202]: I0319 09:50:29.973141 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-config-data\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.973729 master-0 kubenswrapper[15202]: I0319 09:50:29.973704 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-edpm-a\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.973795 master-0 kubenswrapper[15202]: I0319 09:50:29.973776 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-combined-ca-bundle\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.973899 master-0 kubenswrapper[15202]: I0319 09:50:29.973858 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-sb\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.973948 master-0 kubenswrapper[15202]: I0319 09:50:29.973918 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-credential-keys\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974126 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76gc4\" (UniqueName: \"kubernetes.io/projected/44607811-622c-4f18-a414-0094958a5084-kube-api-access-76gc4\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974167 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-svc\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974191 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-nb\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974217 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-fernet-keys\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974248 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-swift-storage-0\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974275 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-scripts\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:29.977685 master-0 kubenswrapper[15202]: I0319 09:50:29.974316 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-config\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.005508 master-0 kubenswrapper[15202]: I0319 09:50:30.002344 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-db-sync-jdc2m"]
Mar 19 09:50:30.005508 master-0 kubenswrapper[15202]: I0319 09:50:30.003909 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.007542 master-0 kubenswrapper[15202]: I0319 09:50:30.006969 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-scripts"
Mar 19 09:50:30.007542 master-0 kubenswrapper[15202]: I0319 09:50:30.007269 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-config-data"
Mar 19 09:50:30.049114 master-0 kubenswrapper[15202]: I0319 09:50:30.049056 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-db-sync-jdc2m"]
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.078386 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-nb\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.078451 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-fernet-keys\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.078501 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-swift-storage-0\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.078522 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-scripts\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.078547 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-config\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.079853 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-swift-storage-0\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.079904 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hbzpf"]
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.081732 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.082945 master-0 kubenswrapper[15202]: I0319 09:50:30.082856 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pft9k\" (UniqueName: \"kubernetes.io/projected/3d687cdb-7807-4089-8c5b-a840cfa6531e-kube-api-access-pft9k\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.084771 master-0 kubenswrapper[15202]: I0319 09:50:30.084744 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-config-data\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.084840 master-0 kubenswrapper[15202]: I0319 09:50:30.084798 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-edpm-a\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.084840 master-0 kubenswrapper[15202]: I0319 09:50:30.084829 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-combined-ca-bundle\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.084905 master-0 kubenswrapper[15202]: I0319 09:50:30.084852 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-sb\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.084905 master-0 kubenswrapper[15202]: I0319 09:50:30.084875 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-credential-keys\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.084983 master-0 kubenswrapper[15202]: I0319 09:50:30.084959 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76gc4\" (UniqueName: \"kubernetes.io/projected/44607811-622c-4f18-a414-0094958a5084-kube-api-access-76gc4\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.085023 master-0 kubenswrapper[15202]: I0319 09:50:30.084992 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-svc\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.085103 master-0 kubenswrapper[15202]: I0319 09:50:30.084753 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-fernet-keys\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.088113 master-0 kubenswrapper[15202]: I0319 09:50:30.088065 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 19 09:50:30.089828 master-0 kubenswrapper[15202]: I0319 09:50:30.089792 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-nb\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.090049 master-0 kubenswrapper[15202]: I0319 09:50:30.090019 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-edpm-a\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.090370 master-0 kubenswrapper[15202]: I0319 09:50:30.090345 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 19 09:50:30.090500 master-0 kubenswrapper[15202]: I0319 09:50:30.090461 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-config\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.091008 master-0 kubenswrapper[15202]: I0319 09:50:30.090986 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-combined-ca-bundle\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.091169 master-0 kubenswrapper[15202]: I0319 09:50:30.091144 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-sb\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.093318 master-0 kubenswrapper[15202]: I0319 09:50:30.091188 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-svc\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.093318 master-0 kubenswrapper[15202]: I0319 09:50:30.092239 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-scripts\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.101983 master-0 kubenswrapper[15202]: I0319 09:50:30.099349 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-config-data\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.102696 master-0 kubenswrapper[15202]: I0319 09:50:30.102656 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-credential-keys\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.112657 master-0 kubenswrapper[15202]: I0319 09:50:30.112618 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76gc4\" (UniqueName: \"kubernetes.io/projected/44607811-622c-4f18-a414-0094958a5084-kube-api-access-76gc4\") pod \"dnsmasq-dns-d8f46bbdf-cnrwt\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.136156 master-0 kubenswrapper[15202]: I0319 09:50:30.125839 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hbzpf"]
Mar 19 09:50:30.136498 master-0 kubenswrapper[15202]: I0319 09:50:30.136244 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pft9k\" (UniqueName: \"kubernetes.io/projected/3d687cdb-7807-4089-8c5b-a840cfa6531e-kube-api-access-pft9k\") pod \"keystone-bootstrap-hlqwd\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") " pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.219203 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.220770 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvpg\" (UniqueName: \"kubernetes.io/projected/972a5655-2953-4875-b9cd-2b5481c6ff30-kube-api-access-nfvpg\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.220827 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04e14c4d-4d08-4c3c-8803-d39b03125169-etc-machine-id\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.220892 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-config-data\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.220934 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-combined-ca-bundle\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.220966 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-combined-ca-bundle\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.220986 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-db-sync-config-data\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.221043 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79gs\" (UniqueName: \"kubernetes.io/projected/04e14c4d-4d08-4c3c-8803-d39b03125169-kube-api-access-j79gs\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.221069 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-config\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.223389 master-0 kubenswrapper[15202]: I0319 09:50:30.221124 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-scripts\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.320480 master-0 kubenswrapper[15202]: I0319 09:50:30.320331 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2flmr"]
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.337714 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338233 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79gs\" (UniqueName: \"kubernetes.io/projected/04e14c4d-4d08-4c3c-8803-d39b03125169-kube-api-access-j79gs\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338366 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-config\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338556 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-scripts\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338697 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvpg\" (UniqueName: \"kubernetes.io/projected/972a5655-2953-4875-b9cd-2b5481c6ff30-kube-api-access-nfvpg\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338777 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04e14c4d-4d08-4c3c-8803-d39b03125169-etc-machine-id\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338880 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-config-data\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.338939 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-combined-ca-bundle\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.339014 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-combined-ca-bundle\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.340489 master-0 kubenswrapper[15202]: I0319 09:50:30.339041 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-db-sync-config-data\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.347494 master-0 kubenswrapper[15202]: I0319 09:50:30.344675 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 19 09:50:30.347494 master-0 kubenswrapper[15202]: I0319 09:50:30.344926 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 19 09:50:30.347494 master-0 kubenswrapper[15202]: I0319 09:50:30.345555 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04e14c4d-4d08-4c3c-8803-d39b03125169-etc-machine-id\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.352801 master-0 kubenswrapper[15202]: I0319 09:50:30.352037 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-scripts\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.352801 master-0 kubenswrapper[15202]: I0319 09:50:30.352430 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-combined-ca-bundle\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.377795 master-0 kubenswrapper[15202]: I0319 09:50:30.362908 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-config\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.386642 master-0 kubenswrapper[15202]: I0319 09:50:30.378789 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2flmr"]
Mar 19 09:50:30.386642 master-0 kubenswrapper[15202]: I0319 09:50:30.382048 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79gs\" (UniqueName: \"kubernetes.io/projected/04e14c4d-4d08-4c3c-8803-d39b03125169-kube-api-access-j79gs\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.386642 master-0 kubenswrapper[15202]: I0319 09:50:30.385096 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvpg\" (UniqueName: \"kubernetes.io/projected/972a5655-2953-4875-b9cd-2b5481c6ff30-kube-api-access-nfvpg\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.401597 master-0 kubenswrapper[15202]: I0319 09:50:30.399353 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8f46bbdf-cnrwt"]
Mar 19 09:50:30.401597 master-0 kubenswrapper[15202]: I0319 09:50:30.400155 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-combined-ca-bundle\") pod \"neutron-db-sync-hbzpf\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.401597 master-0 kubenswrapper[15202]: I0319 09:50:30.400255 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:30.446100 master-0 kubenswrapper[15202]: I0319 09:50:30.444062 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-config-data\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.446100 master-0 kubenswrapper[15202]: I0319 09:50:30.444698 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-db-sync-config-data\") pod \"cinder-7ba05-db-sync-jdc2m\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " pod="openstack/cinder-7ba05-db-sync-jdc2m"
Mar 19 09:50:30.453043 master-0 kubenswrapper[15202]: I0319 09:50:30.449815 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdhl\" (UniqueName: \"kubernetes.io/projected/85ab6d34-c24f-4e22-ac73-939b5a791240-kube-api-access-jhdhl\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.453043 master-0 kubenswrapper[15202]: I0319 09:50:30.450171 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-config-data\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.453043 master-0 kubenswrapper[15202]: I0319 09:50:30.450326 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-combined-ca-bundle\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.453043 master-0 kubenswrapper[15202]: I0319 09:50:30.450393 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-scripts\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.453043 master-0 kubenswrapper[15202]: I0319 09:50:30.450527 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ab6d34-c24f-4e22-ac73-939b5a791240-logs\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.486518 master-0 kubenswrapper[15202]: I0319 09:50:30.486336 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb6bf676c-xlvsw"]
Mar 19 09:50:30.495322 master-0 kubenswrapper[15202]: I0319 09:50:30.495266 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw"
Mar 19 09:50:30.506580 master-0 kubenswrapper[15202]: I0319 09:50:30.506523 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb6bf676c-xlvsw"]
Mar 19 09:50:30.537091 master-0 kubenswrapper[15202]: I0319 09:50:30.536970 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:50:30.552885 master-0 kubenswrapper[15202]: I0319 09:50:30.552815 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-config-data\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.553046 master-0 kubenswrapper[15202]: I0319 09:50:30.552928 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-combined-ca-bundle\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.553046 master-0 kubenswrapper[15202]: I0319 09:50:30.552961 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-scripts\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.553046 master-0 kubenswrapper[15202]: I0319 09:50:30.552997 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ab6d34-c24f-4e22-ac73-939b5a791240-logs\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr"
Mar 19 09:50:30.553046 master-0 kubenswrapper[15202]: I0319 09:50:30.553032 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdhl\" (UniqueName: \"kubernetes.io/projected/85ab6d34-c24f-4e22-ac73-939b5a791240-kube-api-access-jhdhl\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") "
pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.556728 master-0 kubenswrapper[15202]: I0319 09:50:30.556698 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-config-data\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.558409 master-0 kubenswrapper[15202]: I0319 09:50:30.558367 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ab6d34-c24f-4e22-ac73-939b5a791240-logs\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.562790 master-0 kubenswrapper[15202]: I0319 09:50:30.562676 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-scripts\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.562895 master-0 kubenswrapper[15202]: I0319 09:50:30.562842 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-combined-ca-bundle\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.572761 master-0 kubenswrapper[15202]: I0319 09:50:30.571231 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdhl\" (UniqueName: \"kubernetes.io/projected/85ab6d34-c24f-4e22-ac73-939b5a791240-kube-api-access-jhdhl\") pod \"placement-db-sync-2flmr\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.648547 master-0 kubenswrapper[15202]: I0319 
09:50:30.648452 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-db-sync-jdc2m" Mar 19 09:50:30.658680 master-0 kubenswrapper[15202]: I0319 09:50:30.658617 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-edpm-a\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.658919 master-0 kubenswrapper[15202]: I0319 09:50:30.658709 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-config\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.658960 master-0 kubenswrapper[15202]: I0319 09:50:30.658900 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-svc\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.661553 master-0 kubenswrapper[15202]: I0319 09:50:30.659622 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbck\" (UniqueName: \"kubernetes.io/projected/76d834e7-4d24-4e34-8ebd-b71c80766e40-kube-api-access-jdbck\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.663617 master-0 kubenswrapper[15202]: I0319 09:50:30.663232 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.663773 master-0 kubenswrapper[15202]: I0319 09:50:30.663721 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-swift-storage-0\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.663830 master-0 kubenswrapper[15202]: I0319 09:50:30.663782 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.773239 master-0 kubenswrapper[15202]: I0319 09:50:30.772597 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2flmr" Mar 19 09:50:30.778373 master-0 kubenswrapper[15202]: I0319 09:50:30.777983 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-swift-storage-0\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.778373 master-0 kubenswrapper[15202]: I0319 09:50:30.778068 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.778373 master-0 kubenswrapper[15202]: I0319 09:50:30.778160 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-edpm-a\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.779012 master-0 kubenswrapper[15202]: I0319 09:50:30.778828 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-config\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.779012 master-0 kubenswrapper[15202]: I0319 09:50:30.778928 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-svc\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " 
pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.779337 master-0 kubenswrapper[15202]: I0319 09:50:30.779042 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbck\" (UniqueName: \"kubernetes.io/projected/76d834e7-4d24-4e34-8ebd-b71c80766e40-kube-api-access-jdbck\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.779337 master-0 kubenswrapper[15202]: I0319 09:50:30.779192 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.779679 master-0 kubenswrapper[15202]: I0319 09:50:30.779577 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-config\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.780283 master-0 kubenswrapper[15202]: I0319 09:50:30.780205 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.780400 master-0 kubenswrapper[15202]: I0319 09:50:30.780373 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-swift-storage-0\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " 
pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.780575 master-0 kubenswrapper[15202]: I0319 09:50:30.780506 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-edpm-a\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.781087 master-0 kubenswrapper[15202]: I0319 09:50:30.780917 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.781087 master-0 kubenswrapper[15202]: I0319 09:50:30.781021 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-svc\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.802700 master-0 kubenswrapper[15202]: I0319 09:50:30.802579 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbck\" (UniqueName: \"kubernetes.io/projected/76d834e7-4d24-4e34-8ebd-b71c80766e40-kube-api-access-jdbck\") pod \"dnsmasq-dns-7cb6bf676c-xlvsw\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:30.839538 master-0 kubenswrapper[15202]: I0319 09:50:30.839451 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3def07a-70e3-4a58-b0ce-60a0b208548f" path="/var/lib/kubelet/pods/b3def07a-70e3-4a58-b0ce-60a0b208548f/volumes" Mar 19 09:50:30.849165 master-0 kubenswrapper[15202]: I0319 09:50:30.849058 15202 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:50:31.505524 master-0 kubenswrapper[15202]: I0319 09:50:31.503786 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerName="dnsmasq-dns" containerID="cri-o://bd0d6b3de802b5b7439b97d4b6b3d20c34f3058f37fec3f1fb6a34e64fa58307" gracePeriod=10 Mar 19 09:50:31.924814 master-0 kubenswrapper[15202]: I0319 09:50:31.924617 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:50:31.928133 master-0 kubenswrapper[15202]: I0319 09:50:31.928066 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:31.949452 master-0 kubenswrapper[15202]: I0319 09:50:31.949392 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 09:50:31.949702 master-0 kubenswrapper[15202]: I0319 09:50:31.949657 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 09:50:31.951136 master-0 kubenswrapper[15202]: I0319 09:50:31.951107 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-default-external-config-data" Mar 19 09:50:31.988118 master-0 kubenswrapper[15202]: I0319 09:50:31.988061 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:50:32.115639 master-0 kubenswrapper[15202]: I0319 09:50:32.115546 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " 
pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.115846 master-0 kubenswrapper[15202]: I0319 09:50:32.115668 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.115846 master-0 kubenswrapper[15202]: I0319 09:50:32.115759 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.115956 master-0 kubenswrapper[15202]: I0319 09:50:32.115913 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.116014 master-0 kubenswrapper[15202]: I0319 09:50:32.115953 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.116325 master-0 kubenswrapper[15202]: I0319 09:50:32.116263 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.116378 master-0 kubenswrapper[15202]: I0319 09:50:32.116331 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.116412 master-0 kubenswrapper[15202]: I0319 09:50:32.116387 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hv5l\" (UniqueName: \"kubernetes.io/projected/ec4577f9-acc7-4637-b86b-4e1f63f3b477-kube-api-access-4hv5l\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219716 master-0 kubenswrapper[15202]: I0319 09:50:32.219621 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219716 master-0 kubenswrapper[15202]: I0319 09:50:32.219699 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219997 master-0 kubenswrapper[15202]: 
I0319 09:50:32.219739 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219997 master-0 kubenswrapper[15202]: I0319 09:50:32.219901 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219997 master-0 kubenswrapper[15202]: I0319 09:50:32.219927 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219997 master-0 kubenswrapper[15202]: I0319 09:50:32.219966 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.219997 master-0 kubenswrapper[15202]: I0319 09:50:32.219985 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 
09:50:32.220157 master-0 kubenswrapper[15202]: I0319 09:50:32.220010 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hv5l\" (UniqueName: \"kubernetes.io/projected/ec4577f9-acc7-4637-b86b-4e1f63f3b477-kube-api-access-4hv5l\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.220431 master-0 kubenswrapper[15202]: I0319 09:50:32.220389 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.220630 master-0 kubenswrapper[15202]: I0319 09:50:32.220606 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.223430 master-0 kubenswrapper[15202]: I0319 09:50:32.223342 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:50:32.223430 master-0 kubenswrapper[15202]: I0319 09:50:32.223378 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/761159666a2fdb4e1d7229cd039b70780d5ca1904241b607263d6bd54bcba60c/globalmount\"" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.225587 master-0 kubenswrapper[15202]: I0319 09:50:32.225558 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.226409 master-0 kubenswrapper[15202]: I0319 09:50:32.226344 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.226537 master-0 kubenswrapper[15202]: I0319 09:50:32.226514 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.227966 master-0 kubenswrapper[15202]: I0319 09:50:32.227890 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.238085 master-0 kubenswrapper[15202]: I0319 09:50:32.237994 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hv5l\" (UniqueName: \"kubernetes.io/projected/ec4577f9-acc7-4637-b86b-4e1f63f3b477-kube-api-access-4hv5l\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:32.518576 master-0 kubenswrapper[15202]: I0319 09:50:32.518508 15202 generic.go:334] "Generic (PLEG): container finished" podID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerID="bd0d6b3de802b5b7439b97d4b6b3d20c34f3058f37fec3f1fb6a34e64fa58307" exitCode=0 Mar 19 09:50:32.519110 master-0 kubenswrapper[15202]: I0319 09:50:32.518579 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" event={"ID":"98cda58d-8e34-4bcf-8169-179fa5f470cc","Type":"ContainerDied","Data":"bd0d6b3de802b5b7439b97d4b6b3d20c34f3058f37fec3f1fb6a34e64fa58307"} Mar 19 09:50:33.042413 master-0 kubenswrapper[15202]: I0319 09:50:33.042334 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"] Mar 19 09:50:33.044152 master-0 kubenswrapper[15202]: I0319 09:50:33.044108 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:33.049081 master-0 kubenswrapper[15202]: I0319 09:50:33.049043 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-default-internal-config-data" Mar 19 09:50:33.049287 master-0 kubenswrapper[15202]: I0319 09:50:33.049256 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 09:50:33.248131 master-0 kubenswrapper[15202]: I0319 09:50:33.248059 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:33.248384 master-0 kubenswrapper[15202]: I0319 09:50:33.248275 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:33.248384 master-0 kubenswrapper[15202]: I0319 09:50:33.248312 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls8gz\" (UniqueName: \"kubernetes.io/projected/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-kube-api-access-ls8gz\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:33.248498 master-0 kubenswrapper[15202]: I0319 09:50:33.248388 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:33.248498 master-0 kubenswrapper[15202]: I0319 09:50:33.248426 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:33.248638 master-0 kubenswrapper[15202]: I0319 09:50:33.248597 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:33.249152 master-0 kubenswrapper[15202]: I0319 09:50:33.249056 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:33.249266 master-0 kubenswrapper[15202]: I0319 09:50:33.249199 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056280 master-0 kubenswrapper[15202]: I0319 09:50:34.055979 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056280 master-0 kubenswrapper[15202]: I0319 09:50:34.056226 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056982 master-0 kubenswrapper[15202]: I0319 09:50:34.056350 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056982 master-0 kubenswrapper[15202]: I0319 09:50:34.056609 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056982 master-0 kubenswrapper[15202]: I0319 09:50:34.056650 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls8gz\" (UniqueName: \"kubernetes.io/projected/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-kube-api-access-ls8gz\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056982 master-0 kubenswrapper[15202]: I0319 09:50:34.056798 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.056982 master-0 kubenswrapper[15202]: I0319 09:50:34.056960 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.057417 master-0 kubenswrapper[15202]: I0319 09:50:34.057369 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.058232 master-0 kubenswrapper[15202]: I0319 09:50:34.057848 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.079248 master-0 kubenswrapper[15202]: I0319 09:50:34.065915 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.098700 master-0 kubenswrapper[15202]: I0319 09:50:34.084194 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.098700 master-0 kubenswrapper[15202]: I0319 09:50:34.086449 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.134942 master-0 kubenswrapper[15202]: I0319 09:50:34.134895 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls8gz\" (UniqueName: \"kubernetes.io/projected/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-kube-api-access-ls8gz\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.142538 master-0 kubenswrapper[15202]: I0319 09:50:34.141071 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.159940 master-0 kubenswrapper[15202]: I0319 09:50:34.159878 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.166905 master-0 kubenswrapper[15202]: I0319 09:50:34.166852 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:50:34.167231 master-0 kubenswrapper[15202]: I0319 09:50:34.166918 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4408f40dfb1603f9af45ac684ff95accfb01fbfdee7d66269d29c583d6626950/globalmount\"" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:34.192165 master-0 kubenswrapper[15202]: I0319 09:50:34.192109 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:34.237620 master-0 kubenswrapper[15202]: I0319 09:50:34.237450 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hbzpf"]
Mar 19 09:50:34.596804 master-0 kubenswrapper[15202]: I0319 09:50:34.589792 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"]
Mar 19 09:50:34.596804 master-0 kubenswrapper[15202]: E0319 09:50:34.592123 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-3a5fd-default-external-api-0" podUID="ec4577f9-acc7-4637-b86b-4e1f63f3b477"
Mar 19 09:50:34.841794 master-0 kubenswrapper[15202]: I0319 09:50:34.838120 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f88f897-5c5kd"
Mar 19 09:50:34.911010 master-0 kubenswrapper[15202]: I0319 09:50:34.910934 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-edpm-a\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:34.913484 master-0 kubenswrapper[15202]: I0319 09:50:34.911761 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-sb\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:35.019363 master-0 kubenswrapper[15202]: I0319 09:50:35.019165 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-swift-storage-0\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:35.019363 master-0 kubenswrapper[15202]: I0319 09:50:35.019294 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-svc\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:35.019536 master-0 kubenswrapper[15202]: I0319 09:50:35.019407 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/98cda58d-8e34-4bcf-8169-179fa5f470cc-kube-api-access-mppjl\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:35.019536 master-0 kubenswrapper[15202]: I0319 09:50:35.019451 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-nb\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:35.019536 master-0 kubenswrapper[15202]: I0319 09:50:35.019502 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-config\") pod \"98cda58d-8e34-4bcf-8169-179fa5f470cc\" (UID: \"98cda58d-8e34-4bcf-8169-179fa5f470cc\") "
Mar 19 09:50:35.028888 master-0 kubenswrapper[15202]: I0319 09:50:35.028759 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:35.032182 master-0 kubenswrapper[15202]: I0319 09:50:35.031454 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cda58d-8e34-4bcf-8169-179fa5f470cc-kube-api-access-mppjl" (OuterVolumeSpecName: "kube-api-access-mppjl") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "kube-api-access-mppjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:50:35.034835 master-0 kubenswrapper[15202]: I0319 09:50:35.034784 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb6bf676c-xlvsw"]
Mar 19 09:50:35.055591 master-0 kubenswrapper[15202]: W0319 09:50:35.055061 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76d834e7_4d24_4e34_8ebd_b71c80766e40.slice/crio-a60a2f6ac5cc3ceb3c2ba6142fefb082d56944de0cc0fa9989230c1facd37f39 WatchSource:0}: Error finding container a60a2f6ac5cc3ceb3c2ba6142fefb082d56944de0cc0fa9989230c1facd37f39: Status 404 returned error can't find the container with id a60a2f6ac5cc3ceb3c2ba6142fefb082d56944de0cc0fa9989230c1facd37f39
Mar 19 09:50:35.074682 master-0 kubenswrapper[15202]: I0319 09:50:35.074585 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hlqwd"]
Mar 19 09:50:35.122776 master-0 kubenswrapper[15202]: I0319 09:50:35.122623 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:35.134910 master-0 kubenswrapper[15202]: I0319 09:50:35.134211 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/98cda58d-8e34-4bcf-8169-179fa5f470cc-kube-api-access-mppjl\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.134910 master-0 kubenswrapper[15202]: I0319 09:50:35.134273 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.134910 master-0 kubenswrapper[15202]: I0319 09:50:35.134285 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.143518 master-0 kubenswrapper[15202]: I0319 09:50:35.143447 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:35.156256 master-0 kubenswrapper[15202]: I0319 09:50:35.155010 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2flmr"]
Mar 19 09:50:35.158697 master-0 kubenswrapper[15202]: E0319 09:50:35.154920 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-3a5fd-default-internal-api-0" podUID="586b7f66-ec6d-4168-bb8a-87b3c04d82ce"
Mar 19 09:50:35.166150 master-0 kubenswrapper[15202]: I0319 09:50:35.165409 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:35.189752 master-0 kubenswrapper[15202]: I0319 09:50:35.182064 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8f46bbdf-cnrwt"]
Mar 19 09:50:35.189752 master-0 kubenswrapper[15202]: I0319 09:50:35.182862 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hbzpf" event={"ID":"972a5655-2953-4875-b9cd-2b5481c6ff30","Type":"ContainerStarted","Data":"62d661213ddc7402b16f43961ab0ddd88189e01cab778f58d3c4309a43d8ab8e"}
Mar 19 09:50:35.189752 master-0 kubenswrapper[15202]: I0319 09:50:35.182920 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hbzpf" event={"ID":"972a5655-2953-4875-b9cd-2b5481c6ff30","Type":"ContainerStarted","Data":"a0cfb7f46baa0485fd82421ca1b67a699d750cf13a02bc795d246030b201122e"}
Mar 19 09:50:35.199543 master-0 kubenswrapper[15202]: I0319 09:50:35.196594 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2flmr" event={"ID":"85ab6d34-c24f-4e22-ac73-939b5a791240","Type":"ContainerStarted","Data":"d5d6734f49e45387f341c2c0bce19efa0917a0acbedddc64e04117c4628873c5"}
Mar 19 09:50:35.216125 master-0 kubenswrapper[15202]: I0319 09:50:35.207642 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f88f897-5c5kd" event={"ID":"98cda58d-8e34-4bcf-8169-179fa5f470cc","Type":"ContainerDied","Data":"80d3bdead8473a22af41f622af4b4132416e3514989479a0f66199372aee5b35"}
Mar 19 09:50:35.216125 master-0 kubenswrapper[15202]: I0319 09:50:35.207706 15202 scope.go:117] "RemoveContainer" containerID="bd0d6b3de802b5b7439b97d4b6b3d20c34f3058f37fec3f1fb6a34e64fa58307"
Mar 19 09:50:35.216125 master-0 kubenswrapper[15202]: I0319 09:50:35.212436 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f88f897-5c5kd"
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: W0319 09:50:35.231459 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44607811_622c_4f18_a414_0094958a5084.slice/crio-998d7640fe6cc091ff35cf9cd43076700a92c870d6814de45e7f4793ade9d372 WatchSource:0}: Error finding container 998d7640fe6cc091ff35cf9cd43076700a92c870d6814de45e7f4793ade9d372: Status 404 returned error can't find the container with id 998d7640fe6cc091ff35cf9cd43076700a92c870d6814de45e7f4793ade9d372
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.232005 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlqwd" event={"ID":"3d687cdb-7807-4089-8c5b-a840cfa6531e","Type":"ContainerStarted","Data":"23094c261997bb4a3a38c266cd7b551df577ad129dfa25f98c58b4ae9533abc8"}
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.232606 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-config" (OuterVolumeSpecName: "config") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.234900 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.234901 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" event={"ID":"76d834e7-4d24-4e34-8ebd-b71c80766e40","Type":"ContainerStarted","Data":"a60a2f6ac5cc3ceb3c2ba6142fefb082d56944de0cc0fa9989230c1facd37f39"}
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.237931 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.237973 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.239570 master-0 kubenswrapper[15202]: I0319 09:50:35.238259 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:35.253717 master-0 kubenswrapper[15202]: I0319 09:50:35.249121 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-db-sync-jdc2m"]
Mar 19 09:50:35.263523 master-0 kubenswrapper[15202]: I0319 09:50:35.256691 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hbzpf" podStartSLOduration=5.256663554 podStartE2EDuration="5.256663554s" podCreationTimestamp="2026-03-19 09:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:35.211443719 +0000 UTC m=+1552.596858535" watchObservedRunningTime="2026-03-19 09:50:35.256663554 +0000 UTC m=+1552.642078380"
Mar 19 09:50:35.272560 master-0 kubenswrapper[15202]: I0319 09:50:35.270034 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98cda58d-8e34-4bcf-8169-179fa5f470cc" (UID: "98cda58d-8e34-4bcf-8169-179fa5f470cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:50:35.287752 master-0 kubenswrapper[15202]: I0319 09:50:35.285014 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:35.339583 master-0 kubenswrapper[15202]: I0319 09:50:35.339347 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.339583 master-0 kubenswrapper[15202]: I0319 09:50:35.339391 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cda58d-8e34-4bcf-8169-179fa5f470cc-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.445882 master-0 kubenswrapper[15202]: I0319 09:50:35.445838 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:35.505822 master-0 kubenswrapper[15202]: I0319 09:50:35.505698 15202 scope.go:117] "RemoveContainer" containerID="440dcbc4eba725ffdff558b14407e715b7993442f78fd2a3de9f4dfa2a1c7e62"
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542133 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-scripts\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542212 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-combined-ca-bundle\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542349 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-public-tls-certs\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542389 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-config-data\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542548 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542612 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-httpd-run\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542679 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-logs\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.543636 master-0 kubenswrapper[15202]: I0319 09:50:35.542729 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hv5l\" (UniqueName: \"kubernetes.io/projected/ec4577f9-acc7-4637-b86b-4e1f63f3b477-kube-api-access-4hv5l\") pod \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\" (UID: \"ec4577f9-acc7-4637-b86b-4e1f63f3b477\") "
Mar 19 09:50:35.550121 master-0 kubenswrapper[15202]: I0319 09:50:35.547599 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:50:35.550121 master-0 kubenswrapper[15202]: I0319 09:50:35.548993 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-logs" (OuterVolumeSpecName: "logs") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:50:35.559052 master-0 kubenswrapper[15202]: I0319 09:50:35.558394 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:35.565443 master-0 kubenswrapper[15202]: I0319 09:50:35.565291 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85f88f897-5c5kd"]
Mar 19 09:50:35.577208 master-0 kubenswrapper[15202]: I0319 09:50:35.576425 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85f88f897-5c5kd"]
Mar 19 09:50:35.581793 master-0 kubenswrapper[15202]: I0319 09:50:35.581251 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-config-data" (OuterVolumeSpecName: "config-data") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:35.585212 master-0 kubenswrapper[15202]: I0319 09:50:35.585133 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-scripts" (OuterVolumeSpecName: "scripts") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:35.585349 master-0 kubenswrapper[15202]: I0319 09:50:35.585248 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec4577f9-acc7-4637-b86b-4e1f63f3b477-kube-api-access-4hv5l" (OuterVolumeSpecName: "kube-api-access-4hv5l") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "kube-api-access-4hv5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:50:35.589535 master-0 kubenswrapper[15202]: I0319 09:50:35.589099 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644361 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644407 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644421 15202 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644446 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4577f9-acc7-4637-b86b-4e1f63f3b477-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644457 15202 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644486 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec4577f9-acc7-4637-b86b-4e1f63f3b477-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:35.644747 master-0 kubenswrapper[15202]: I0319 09:50:35.644496 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hv5l\" (UniqueName: \"kubernetes.io/projected/ec4577f9-acc7-4637-b86b-4e1f63f3b477-kube-api-access-4hv5l\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:36.289720 master-0 kubenswrapper[15202]: I0319 09:50:36.289636 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlqwd" event={"ID":"3d687cdb-7807-4089-8c5b-a840cfa6531e","Type":"ContainerStarted","Data":"c52b2aa7760d01a923982628c21e81cd4bdbb7bcc63829779172f81c54b5d42a"}
Mar 19 09:50:36.309208 master-0 kubenswrapper[15202]: I0319 09:50:36.308382 15202 generic.go:334] "Generic (PLEG): container finished" podID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerID="a14e8229932890689469fd4553d6d7254e31b6aa71d9d75a552b4a62a0b99786" exitCode=0
Mar 19 09:50:36.309208 master-0 kubenswrapper[15202]: I0319 09:50:36.308743 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" event={"ID":"76d834e7-4d24-4e34-8ebd-b71c80766e40","Type":"ContainerDied","Data":"a14e8229932890689469fd4553d6d7254e31b6aa71d9d75a552b4a62a0b99786"}
Mar 19 09:50:36.326930 master-0 kubenswrapper[15202]: I0319 09:50:36.324683 15202 generic.go:334] "Generic (PLEG): container finished" podID="44607811-622c-4f18-a414-0094958a5084" containerID="ac8dab71c2aaf447bb4eda7d2f77476bf3deec222b02c647bd2c5eca4c24e7be" exitCode=0
Mar 19 09:50:36.326930 master-0 kubenswrapper[15202]: I0319 09:50:36.324809 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt" event={"ID":"44607811-622c-4f18-a414-0094958a5084","Type":"ContainerDied","Data":"ac8dab71c2aaf447bb4eda7d2f77476bf3deec222b02c647bd2c5eca4c24e7be"}
Mar 19 09:50:36.326930 master-0 kubenswrapper[15202]: I0319 09:50:36.324842 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt" event={"ID":"44607811-622c-4f18-a414-0094958a5084","Type":"ContainerStarted","Data":"998d7640fe6cc091ff35cf9cd43076700a92c870d6814de45e7f4793ade9d372"}
Mar 19 09:50:36.334768 master-0 kubenswrapper[15202]: I0319 09:50:36.334716 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-db-sync-jdc2m" event={"ID":"04e14c4d-4d08-4c3c-8803-d39b03125169","Type":"ContainerStarted","Data":"7b2bddc13623d27f26a4e91cb34264c17999c7129802cb78be8b0045c4365d98"}
Mar 19 09:50:36.334953 master-0 kubenswrapper[15202]: I0319 09:50:36.334802 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:36.335123 master-0 kubenswrapper[15202]: I0319 09:50:36.335103 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:36.360457 master-0 kubenswrapper[15202]: I0319 09:50:36.359531 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hlqwd" podStartSLOduration=7.359509568 podStartE2EDuration="7.359509568s" podCreationTimestamp="2026-03-19 09:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:36.328322489 +0000 UTC m=+1553.713737315" watchObservedRunningTime="2026-03-19 09:50:36.359509568 +0000 UTC m=+1553.744924374"
Mar 19 09:50:36.475494 master-0 kubenswrapper[15202]: I0319 09:50:36.475342 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:36.542982 master-0 kubenswrapper[15202]: I0319 09:50:36.542806 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688" (OuterVolumeSpecName: "glance") pod "ec4577f9-acc7-4637-b86b-4e1f63f3b477" (UID: "ec4577f9-acc7-4637-b86b-4e1f63f3b477"). InnerVolumeSpecName "pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 09:50:36.557782 master-0 kubenswrapper[15202]: I0319 09:50:36.557352 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319 09:50:36.616157 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-combined-ca-bundle\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") "
Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319 09:50:36.616222 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-scripts\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") "
Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319 09:50:36.616284 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls8gz\" (UniqueName: \"kubernetes.io/projected/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-kube-api-access-ls8gz\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") "
Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319 09:50:36.616356 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-httpd-run\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") "
Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319
09:50:36.617129 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319 09:50:36.617179 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-internal-tls-certs\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " Mar 19 09:50:36.617294 master-0 kubenswrapper[15202]: I0319 09:50:36.617220 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-config-data\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " Mar 19 09:50:36.618305 master-0 kubenswrapper[15202]: I0319 09:50:36.617357 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-logs\") pod \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\" (UID: \"586b7f66-ec6d-4168-bb8a-87b3c04d82ce\") " Mar 19 09:50:36.621760 master-0 kubenswrapper[15202]: I0319 09:50:36.619755 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:36.623574 master-0 kubenswrapper[15202]: I0319 09:50:36.622909 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-scripts" (OuterVolumeSpecName: "scripts") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:36.628254 master-0 kubenswrapper[15202]: I0319 09:50:36.624682 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-logs" (OuterVolumeSpecName: "logs") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:36.628254 master-0 kubenswrapper[15202]: I0319 09:50:36.624937 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.628254 master-0 kubenswrapper[15202]: I0319 09:50:36.624963 15202 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.628254 master-0 kubenswrapper[15202]: I0319 09:50:36.625052 15202 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") on node \"master-0\" " Mar 19 09:50:36.631713 master-0 kubenswrapper[15202]: I0319 09:50:36.628637 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-config-data" 
(OuterVolumeSpecName: "config-data") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:36.634834 master-0 kubenswrapper[15202]: I0319 09:50:36.632027 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-kube-api-access-ls8gz" (OuterVolumeSpecName: "kube-api-access-ls8gz") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "kube-api-access-ls8gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:36.645601 master-0 kubenswrapper[15202]: I0319 09:50:36.644887 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097" (OuterVolumeSpecName: "glance") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:50:36.650477 master-0 kubenswrapper[15202]: I0319 09:50:36.650281 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:36.690939 master-0 kubenswrapper[15202]: I0319 09:50:36.690859 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "586b7f66-ec6d-4168-bb8a-87b3c04d82ce" (UID: "586b7f66-ec6d-4168-bb8a-87b3c04d82ce"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:36.691910 master-0 kubenswrapper[15202]: I0319 09:50:36.691889 15202 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:50:36.692048 master-0 kubenswrapper[15202]: I0319 09:50:36.692037 15202 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f" (UniqueName: "kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688") on node "master-0" Mar 19 09:50:36.729778 master-0 kubenswrapper[15202]: I0319 09:50:36.729704 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.729778 master-0 kubenswrapper[15202]: I0319 09:50:36.729752 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.729778 master-0 kubenswrapper[15202]: I0319 09:50:36.729764 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls8gz\" (UniqueName: \"kubernetes.io/projected/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-kube-api-access-ls8gz\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.730087 master-0 kubenswrapper[15202]: I0319 09:50:36.729802 15202 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") on node \"master-0\" " Mar 19 09:50:36.730087 master-0 kubenswrapper[15202]: I0319 09:50:36.729814 15202 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.730087 master-0 kubenswrapper[15202]: I0319 09:50:36.729824 15202 reconciler_common.go:293] "Volume detached for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.730087 master-0 kubenswrapper[15202]: I0319 09:50:36.729840 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586b7f66-ec6d-4168-bb8a-87b3c04d82ce-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.822600 master-0 kubenswrapper[15202]: I0319 09:50:36.821588 15202 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:50:36.822600 master-0 kubenswrapper[15202]: I0319 09:50:36.821762 15202 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba" (UniqueName: "kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097") on node "master-0" Mar 19 09:50:36.836612 master-0 kubenswrapper[15202]: I0319 09:50:36.836032 15202 reconciler_common.go:293] "Volume detached for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:36.849961 master-0 kubenswrapper[15202]: I0319 09:50:36.841984 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" path="/var/lib/kubelet/pods/98cda58d-8e34-4bcf-8169-179fa5f470cc/volumes" Mar 19 09:50:36.883864 master-0 kubenswrapper[15202]: I0319 09:50:36.883797 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:50:36.902645 master-0 
kubenswrapper[15202]: I0319 09:50:36.902534 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt" Mar 19 09:50:36.912309 master-0 kubenswrapper[15202]: I0319 09:50:36.907575 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:50:36.920629 master-0 kubenswrapper[15202]: I0319 09:50:36.920533 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:50:36.927088 master-0 kubenswrapper[15202]: E0319 09:50:36.927002 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerName="dnsmasq-dns" Mar 19 09:50:36.927279 master-0 kubenswrapper[15202]: I0319 09:50:36.927090 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerName="dnsmasq-dns" Mar 19 09:50:36.927279 master-0 kubenswrapper[15202]: E0319 09:50:36.927148 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerName="init" Mar 19 09:50:36.927279 master-0 kubenswrapper[15202]: I0319 09:50:36.927158 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerName="init" Mar 19 09:50:36.927279 master-0 kubenswrapper[15202]: E0319 09:50:36.927199 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44607811-622c-4f18-a414-0094958a5084" containerName="init" Mar 19 09:50:36.927279 master-0 kubenswrapper[15202]: I0319 09:50:36.927207 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="44607811-622c-4f18-a414-0094958a5084" containerName="init" Mar 19 09:50:36.930938 master-0 kubenswrapper[15202]: I0319 09:50:36.928154 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cda58d-8e34-4bcf-8169-179fa5f470cc" containerName="dnsmasq-dns" Mar 19 09:50:36.930938 
master-0 kubenswrapper[15202]: I0319 09:50:36.928193 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="44607811-622c-4f18-a414-0094958a5084" containerName="init" Mar 19 09:50:36.932763 master-0 kubenswrapper[15202]: I0319 09:50:36.932698 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:36.936111 master-0 kubenswrapper[15202]: I0319 09:50:36.936017 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-default-external-config-data" Mar 19 09:50:36.943121 master-0 kubenswrapper[15202]: I0319 09:50:36.943079 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:50:36.944941 master-0 kubenswrapper[15202]: I0319 09:50:36.944588 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 09:50:37.040957 master-0 kubenswrapper[15202]: I0319 09:50:37.040897 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-svc\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041181 master-0 kubenswrapper[15202]: I0319 09:50:37.041048 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-nb\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041181 master-0 kubenswrapper[15202]: I0319 09:50:37.041076 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-config\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: 
\"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041181 master-0 kubenswrapper[15202]: I0319 09:50:37.041117 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-swift-storage-0\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041295 master-0 kubenswrapper[15202]: I0319 09:50:37.041187 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76gc4\" (UniqueName: \"kubernetes.io/projected/44607811-622c-4f18-a414-0094958a5084-kube-api-access-76gc4\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041295 master-0 kubenswrapper[15202]: I0319 09:50:37.041217 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-edpm-a\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041295 master-0 kubenswrapper[15202]: I0319 09:50:37.041236 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-sb\") pod \"44607811-622c-4f18-a414-0094958a5084\" (UID: \"44607811-622c-4f18-a414-0094958a5084\") " Mar 19 09:50:37.041629 master-0 kubenswrapper[15202]: I0319 09:50:37.041602 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041678 master-0 
kubenswrapper[15202]: I0319 09:50:37.041649 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041723 master-0 kubenswrapper[15202]: I0319 09:50:37.041680 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041759 master-0 kubenswrapper[15202]: I0319 09:50:37.041734 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94vk\" (UniqueName: \"kubernetes.io/projected/20a9e839-7eb3-4ba6-bc63-7220be59d238-kube-api-access-k94vk\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041833 master-0 kubenswrapper[15202]: I0319 09:50:37.041788 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041876 master-0 kubenswrapper[15202]: I0319 09:50:37.041831 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041876 master-0 kubenswrapper[15202]: I0319 09:50:37.041871 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.041939 master-0 kubenswrapper[15202]: I0319 09:50:37.041920 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.067549 master-0 kubenswrapper[15202]: I0319 09:50:37.067445 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:37.068104 master-0 kubenswrapper[15202]: I0319 09:50:37.067990 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44607811-622c-4f18-a414-0094958a5084-kube-api-access-76gc4" (OuterVolumeSpecName: "kube-api-access-76gc4") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "kube-api-access-76gc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:37.093114 master-0 kubenswrapper[15202]: I0319 09:50:37.092933 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:37.093760 master-0 kubenswrapper[15202]: I0319 09:50:37.093672 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:37.093945 master-0 kubenswrapper[15202]: I0319 09:50:37.093914 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-config" (OuterVolumeSpecName: "config") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:37.094427 master-0 kubenswrapper[15202]: I0319 09:50:37.094391 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:37.095169 master-0 kubenswrapper[15202]: I0319 09:50:37.095087 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44607811-622c-4f18-a414-0094958a5084" (UID: "44607811-622c-4f18-a414-0094958a5084"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:37.114282 master-0 kubenswrapper[15202]: E0319 09:50:37.114233 15202 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586b7f66_ec6d_4168_bb8a_87b3c04d82ce.slice\": RecentStats: unable to find data in memory cache]" Mar 19 09:50:37.145389 master-0 kubenswrapper[15202]: I0319 09:50:37.145302 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145665 master-0 kubenswrapper[15202]: I0319 09:50:37.145403 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145665 master-0 kubenswrapper[15202]: I0319 09:50:37.145443 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: 
\"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145665 master-0 kubenswrapper[15202]: I0319 09:50:37.145488 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145665 master-0 kubenswrapper[15202]: I0319 09:50:37.145544 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94vk\" (UniqueName: \"kubernetes.io/projected/20a9e839-7eb3-4ba6-bc63-7220be59d238-kube-api-access-k94vk\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145665 master-0 kubenswrapper[15202]: I0319 09:50:37.145608 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145850 master-0 kubenswrapper[15202]: I0319 09:50:37.145762 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.145850 master-0 kubenswrapper[15202]: I0319 09:50:37.145815 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: I0319 09:50:37.145905 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: I0319 09:50:37.145921 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: I0319 09:50:37.145937 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: I0319 09:50:37.145946 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: I0319 09:50:37.145957 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: I0319 09:50:37.145971 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76gc4\" (UniqueName: \"kubernetes.io/projected/44607811-622c-4f18-a414-0094958a5084-kube-api-access-76gc4\") on node \"master-0\" DevicePath \"\"" Mar 19 09:50:37.146112 master-0 kubenswrapper[15202]: 
I0319 09:50:37.145980 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/44607811-622c-4f18-a414-0094958a5084-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:37.151731 master-0 kubenswrapper[15202]: I0319 09:50:37.149414 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.151731 master-0 kubenswrapper[15202]: I0319 09:50:37.149983 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.152132 master-0 kubenswrapper[15202]: I0319 09:50:37.152081 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.156313 master-0 kubenswrapper[15202]: I0319 09:50:37.155056 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.156313 master-0 kubenswrapper[15202]: I0319 09:50:37.154549 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:50:37.159672 master-0 kubenswrapper[15202]: I0319 09:50:37.159297 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/761159666a2fdb4e1d7229cd039b70780d5ca1904241b607263d6bd54bcba60c/globalmount\"" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.160447 master-0 kubenswrapper[15202]: I0319 09:50:37.160387 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.168611 master-0 kubenswrapper[15202]: I0319 09:50:37.168561 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.172865 master-0 kubenswrapper[15202]: I0319 09:50:37.172508 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94vk\" (UniqueName: \"kubernetes.io/projected/20a9e839-7eb3-4ba6-bc63-7220be59d238-kube-api-access-k94vk\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:37.359893 master-0 kubenswrapper[15202]: I0319 09:50:37.359725 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" event={"ID":"76d834e7-4d24-4e34-8ebd-b71c80766e40","Type":"ContainerStarted","Data":"9941eb25adc16021af551ad625b756cead27849d221b0f4deee8036d26ddd3fa"}
Mar 19 09:50:37.362025 master-0 kubenswrapper[15202]: I0319 09:50:37.360890 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw"
Mar 19 09:50:37.367954 master-0 kubenswrapper[15202]: I0319 09:50:37.367063 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt" event={"ID":"44607811-622c-4f18-a414-0094958a5084","Type":"ContainerDied","Data":"998d7640fe6cc091ff35cf9cd43076700a92c870d6814de45e7f4793ade9d372"}
Mar 19 09:50:37.367954 master-0 kubenswrapper[15202]: I0319 09:50:37.367186 15202 scope.go:117] "RemoveContainer" containerID="ac8dab71c2aaf447bb4eda7d2f77476bf3deec222b02c647bd2c5eca4c24e7be"
Mar 19 09:50:37.367954 master-0 kubenswrapper[15202]: I0319 09:50:37.367271 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8f46bbdf-cnrwt"
Mar 19 09:50:37.367954 master-0 kubenswrapper[15202]: I0319 09:50:37.367422 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.425769 master-0 kubenswrapper[15202]: I0319 09:50:37.420790 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" podStartSLOduration=7.420768028 podStartE2EDuration="7.420768028s" podCreationTimestamp="2026-03-19 09:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:37.393389872 +0000 UTC m=+1554.778804688" watchObservedRunningTime="2026-03-19 09:50:37.420768028 +0000 UTC m=+1554.806182844"
Mar 19 09:50:37.499493 master-0 kubenswrapper[15202]: I0319 09:50:37.497436 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:37.515023 master-0 kubenswrapper[15202]: I0319 09:50:37.514845 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:37.548493 master-0 kubenswrapper[15202]: I0319 09:50:37.546549 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:37.558402 master-0 kubenswrapper[15202]: I0319 09:50:37.551644 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.558402 master-0 kubenswrapper[15202]: I0319 09:50:37.555020 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 19 09:50:37.574551 master-0 kubenswrapper[15202]: I0319 09:50:37.570257 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-default-internal-config-data"
Mar 19 09:50:37.593450 master-0 kubenswrapper[15202]: I0319 09:50:37.590698 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8f46bbdf-cnrwt"]
Mar 19 09:50:37.628329 master-0 kubenswrapper[15202]: I0319 09:50:37.628130 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:37.646578 master-0 kubenswrapper[15202]: I0319 09:50:37.645594 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d8f46bbdf-cnrwt"]
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.675845 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.675952 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.675993 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.676308 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.676564 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6crtb\" (UniqueName: \"kubernetes.io/projected/496c49f4-9bde-41e5-ab83-477abcf1c5ef-kube-api-access-6crtb\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.676711 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.676883 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.677911 master-0 kubenswrapper[15202]: I0319 09:50:37.677014 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.778850 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.778976 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779059 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779107 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779506 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779565 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779829 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779894 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6crtb\" (UniqueName: \"kubernetes.io/projected/496c49f4-9bde-41e5-ab83-477abcf1c5ef-kube-api-access-6crtb\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.779952 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.780865 master-0 kubenswrapper[15202]: I0319 09:50:37.780028 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.784266 master-0 kubenswrapper[15202]: I0319 09:50:37.784029 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.785344 master-0 kubenswrapper[15202]: I0319 09:50:37.785226 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:50:37.785344 master-0 kubenswrapper[15202]: I0319 09:50:37.785304 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4408f40dfb1603f9af45ac684ff95accfb01fbfdee7d66269d29c583d6626950/globalmount\"" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.785511 master-0 kubenswrapper[15202]: I0319 09:50:37.785386 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.789490 master-0 kubenswrapper[15202]: I0319 09:50:37.787772 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.789490 master-0 kubenswrapper[15202]: I0319 09:50:37.788228 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:37.798450 master-0 kubenswrapper[15202]: I0319 09:50:37.797601 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6crtb\" (UniqueName: \"kubernetes.io/projected/496c49f4-9bde-41e5-ab83-477abcf1c5ef-kube-api-access-6crtb\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:38.060554 master-0 kubenswrapper[15202]: I0319 09:50:38.060492 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") " pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:38.185877 master-0 kubenswrapper[15202]: I0319 09:50:38.185500 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:38.806736 master-0 kubenswrapper[15202]: I0319 09:50:38.805519 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"]
Mar 19 09:50:38.842851 master-0 kubenswrapper[15202]: I0319 09:50:38.842268 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44607811-622c-4f18-a414-0094958a5084" path="/var/lib/kubelet/pods/44607811-622c-4f18-a414-0094958a5084/volumes"
Mar 19 09:50:38.843967 master-0 kubenswrapper[15202]: I0319 09:50:38.843679 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586b7f66-ec6d-4168-bb8a-87b3c04d82ce" path="/var/lib/kubelet/pods/586b7f66-ec6d-4168-bb8a-87b3c04d82ce/volumes"
Mar 19 09:50:38.847665 master-0 kubenswrapper[15202]: I0319 09:50:38.845172 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec4577f9-acc7-4637-b86b-4e1f63f3b477" path="/var/lib/kubelet/pods/ec4577f9-acc7-4637-b86b-4e1f63f3b477/volumes"
Mar 19 09:50:39.576937 master-0 kubenswrapper[15202]: I0319 09:50:39.576831 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:39.714356 master-0 kubenswrapper[15202]: I0319 09:50:39.714283 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0"
Mar 19 09:50:41.491622 master-0 kubenswrapper[15202]: I0319 09:50:41.491554 15202 generic.go:334] "Generic (PLEG): container finished" podID="3d687cdb-7807-4089-8c5b-a840cfa6531e" containerID="c52b2aa7760d01a923982628c21e81cd4bdbb7bcc63829779172f81c54b5d42a" exitCode=0
Mar 19 09:50:41.492256 master-0 kubenswrapper[15202]: I0319 09:50:41.491620 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlqwd" event={"ID":"3d687cdb-7807-4089-8c5b-a840cfa6531e","Type":"ContainerDied","Data":"c52b2aa7760d01a923982628c21e81cd4bdbb7bcc63829779172f81c54b5d42a"}
Mar 19 09:50:41.496894 master-0 kubenswrapper[15202]: I0319 09:50:41.496848 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"20a9e839-7eb3-4ba6-bc63-7220be59d238","Type":"ContainerStarted","Data":"66072e97e8c03addc037a16bdd832c64e2ef534fcdab06a5df8ead3d9c8d05fb"}
Mar 19 09:50:41.497026 master-0 kubenswrapper[15202]: I0319 09:50:41.496898 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"20a9e839-7eb3-4ba6-bc63-7220be59d238","Type":"ContainerStarted","Data":"ecce5544b338800bfb03c9c629211eb5feb9d9a1715988c3650175f18c5ef6bc"}
Mar 19 09:50:42.300091 master-0 kubenswrapper[15202]: I0319 09:50:42.300031 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:50:42.517847 master-0 kubenswrapper[15202]: I0319 09:50:42.515655 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"496c49f4-9bde-41e5-ab83-477abcf1c5ef","Type":"ContainerStarted","Data":"8cd4a37f5217605fe5838493834b38280a37aebc751efd2af6bd248cd4427f3a"}
Mar 19 09:50:42.521564 master-0 kubenswrapper[15202]: I0319 09:50:42.521113 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2flmr" event={"ID":"85ab6d34-c24f-4e22-ac73-939b5a791240","Type":"ContainerStarted","Data":"3efb915151e6ca3740a776363e7453d24670d0b9cd300ddafeee1f941012cc5f"}
Mar 19 09:50:42.525502 master-0 kubenswrapper[15202]: I0319 09:50:42.525279 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"20a9e839-7eb3-4ba6-bc63-7220be59d238","Type":"ContainerStarted","Data":"369ad3b9f0e4ef9c96b7b0fdf9c622246a6c2506dd02977cc69001dd6fe027e7"}
Mar 19 09:50:43.089606 master-0 kubenswrapper[15202]: I0319 09:50:43.089090 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:43.238976 master-0 kubenswrapper[15202]: I0319 09:50:43.238627 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2flmr" podStartSLOduration=6.630215661 podStartE2EDuration="13.238563882s" podCreationTimestamp="2026-03-19 09:50:30 +0000 UTC" firstStartedPulling="2026-03-19 09:50:35.04474194 +0000 UTC m=+1552.430156756" lastFinishedPulling="2026-03-19 09:50:41.653090161 +0000 UTC m=+1559.038504977" observedRunningTime="2026-03-19 09:50:43.235835065 +0000 UTC m=+1560.621249881" watchObservedRunningTime="2026-03-19 09:50:43.238563882 +0000 UTC m=+1560.623978698"
Mar 19 09:50:43.288295 master-0 kubenswrapper[15202]: I0319 09:50:43.288237 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-config-data\") pod \"3d687cdb-7807-4089-8c5b-a840cfa6531e\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") "
Mar 19 09:50:43.288528 master-0 kubenswrapper[15202]: I0319 09:50:43.288368 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-scripts\") pod \"3d687cdb-7807-4089-8c5b-a840cfa6531e\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") "
Mar 19 09:50:43.288631 master-0 kubenswrapper[15202]: I0319 09:50:43.288583 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-fernet-keys\") pod \"3d687cdb-7807-4089-8c5b-a840cfa6531e\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") "
Mar 19 09:50:43.288714 master-0 kubenswrapper[15202]: I0319 09:50:43.288640 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-combined-ca-bundle\") pod \"3d687cdb-7807-4089-8c5b-a840cfa6531e\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") "
Mar 19 09:50:43.288714 master-0 kubenswrapper[15202]: I0319 09:50:43.288682 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-credential-keys\") pod \"3d687cdb-7807-4089-8c5b-a840cfa6531e\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") "
Mar 19 09:50:43.288803 master-0 kubenswrapper[15202]: I0319 09:50:43.288775 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pft9k\" (UniqueName: \"kubernetes.io/projected/3d687cdb-7807-4089-8c5b-a840cfa6531e-kube-api-access-pft9k\") pod \"3d687cdb-7807-4089-8c5b-a840cfa6531e\" (UID: \"3d687cdb-7807-4089-8c5b-a840cfa6531e\") "
Mar 19 09:50:43.292041 master-0 kubenswrapper[15202]: I0319 09:50:43.291584 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d687cdb-7807-4089-8c5b-a840cfa6531e" (UID: "3d687cdb-7807-4089-8c5b-a840cfa6531e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:43.293012 master-0 kubenswrapper[15202]: I0319 09:50:43.292958 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3d687cdb-7807-4089-8c5b-a840cfa6531e" (UID: "3d687cdb-7807-4089-8c5b-a840cfa6531e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:43.299439 master-0 kubenswrapper[15202]: I0319 09:50:43.299380 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-scripts" (OuterVolumeSpecName: "scripts") pod "3d687cdb-7807-4089-8c5b-a840cfa6531e" (UID: "3d687cdb-7807-4089-8c5b-a840cfa6531e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:43.299682 master-0 kubenswrapper[15202]: I0319 09:50:43.299649 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d687cdb-7807-4089-8c5b-a840cfa6531e-kube-api-access-pft9k" (OuterVolumeSpecName: "kube-api-access-pft9k") pod "3d687cdb-7807-4089-8c5b-a840cfa6531e" (UID: "3d687cdb-7807-4089-8c5b-a840cfa6531e"). InnerVolumeSpecName "kube-api-access-pft9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:50:43.324590 master-0 kubenswrapper[15202]: I0319 09:50:43.324530 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-config-data" (OuterVolumeSpecName: "config-data") pod "3d687cdb-7807-4089-8c5b-a840cfa6531e" (UID: "3d687cdb-7807-4089-8c5b-a840cfa6531e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:43.343226 master-0 kubenswrapper[15202]: I0319 09:50:43.343090 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d687cdb-7807-4089-8c5b-a840cfa6531e" (UID: "3d687cdb-7807-4089-8c5b-a840cfa6531e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:50:43.394870 master-0 kubenswrapper[15202]: I0319 09:50:43.394744 15202 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:43.394870 master-0 kubenswrapper[15202]: I0319 09:50:43.394789 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:43.394870 master-0 kubenswrapper[15202]: I0319 09:50:43.394800 15202 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:43.394870 master-0 kubenswrapper[15202]: I0319 09:50:43.394810 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pft9k\" (UniqueName: \"kubernetes.io/projected/3d687cdb-7807-4089-8c5b-a840cfa6531e-kube-api-access-pft9k\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:43.394870 master-0 kubenswrapper[15202]: I0319 09:50:43.394820 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:43.394870 master-0 kubenswrapper[15202]: I0319 09:50:43.394828 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d687cdb-7807-4089-8c5b-a840cfa6531e-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:50:43.539969 master-0 kubenswrapper[15202]: I0319 09:50:43.539908 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"496c49f4-9bde-41e5-ab83-477abcf1c5ef","Type":"ContainerStarted","Data":"98854ebdc204abcd4cff2dac5ca4f7526aadfe02d73a0ff60ff32c560e6899db"}
Mar 19 09:50:43.542209 master-0 kubenswrapper[15202]: I0319 09:50:43.542001 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlqwd" event={"ID":"3d687cdb-7807-4089-8c5b-a840cfa6531e","Type":"ContainerDied","Data":"23094c261997bb4a3a38c266cd7b551df577ad129dfa25f98c58b4ae9533abc8"}
Mar 19 09:50:43.542209 master-0 kubenswrapper[15202]: I0319 09:50:43.542044 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23094c261997bb4a3a38c266cd7b551df577ad129dfa25f98c58b4ae9533abc8"
Mar 19 09:50:43.542209 master-0 kubenswrapper[15202]: I0319 09:50:43.542151 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlqwd"
Mar 19 09:50:43.639372 master-0 kubenswrapper[15202]: I0319 09:50:43.639221 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3a5fd-default-external-api-0" podStartSLOduration=7.639200977 podStartE2EDuration="7.639200977s" podCreationTimestamp="2026-03-19 09:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:43.620935737 +0000 UTC m=+1561.006350583" watchObservedRunningTime="2026-03-19 09:50:43.639200977 +0000 UTC m=+1561.024615793"
Mar 19 09:50:44.555247 master-0 kubenswrapper[15202]: I0319 09:50:44.555184 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"496c49f4-9bde-41e5-ab83-477abcf1c5ef","Type":"ContainerStarted","Data":"df5424396812996687c783a3b69bbe79ddcfc74205891ffc7ddd501a9b5f7d01"}
Mar 19 09:50:45.854151 master-0 kubenswrapper[15202]: I0319 09:50:45.852626 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw"
Mar 19 09:50:47.176152 master-0 kubenswrapper[15202]: I0319 09:50:47.176091 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hlqwd"]
Mar 19 09:50:47.206919 master-0 kubenswrapper[15202]: I0319 09:50:47.206850 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hlqwd"]
Mar 19 09:50:47.682605 master-0 kubenswrapper[15202]: I0319 09:50:47.682452 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3a5fd-default-internal-api-0" podStartSLOduration=10.68242595 podStartE2EDuration="10.68242595s" podCreationTimestamp="2026-03-19 09:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:47.671347357 +0000 UTC m=+1565.056762183" watchObservedRunningTime="2026-03-19 09:50:47.68242595 +0000 UTC m=+1565.067840776"
Mar 19 09:50:47.861088 master-0 kubenswrapper[15202]: I0319 09:50:47.860971 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9748bd58f-s2fbq"]
Mar 19 09:50:47.861407 master-0 kubenswrapper[15202]: I0319 09:50:47.861366 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="dnsmasq-dns" containerID="cri-o://a44cdcbb716944e9e26b3e2360e1a17368c0d236d50cf6c35c4f60a21d2a6ba0" gracePeriod=10
Mar 19 09:50:48.186553 master-0 kubenswrapper[15202]: I0319 09:50:48.186484 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:48.187743 master-0 kubenswrapper[15202]: I0319 09:50:48.187717 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:48.217696 master-0 kubenswrapper[15202]: I0319 09:50:48.217640 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:48.229682 master-0 kubenswrapper[15202]: I0319 09:50:48.229635 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:48.309427 master-0 kubenswrapper[15202]: I0319 09:50:48.309004 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xl426"]
Mar 19 09:50:48.312529 master-0 kubenswrapper[15202]: E0319 09:50:48.310060 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d687cdb-7807-4089-8c5b-a840cfa6531e" containerName="keystone-bootstrap"
Mar 19 09:50:48.312529 master-0 kubenswrapper[15202]: I0319 09:50:48.310094 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d687cdb-7807-4089-8c5b-a840cfa6531e" containerName="keystone-bootstrap"
Mar 19 09:50:48.312529 master-0 kubenswrapper[15202]: I0319 09:50:48.310753 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d687cdb-7807-4089-8c5b-a840cfa6531e" containerName="keystone-bootstrap"
Mar 19 09:50:48.317528 master-0 kubenswrapper[15202]: I0319 09:50:48.317488 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.321151 master-0 kubenswrapper[15202]: I0319 09:50:48.321114 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 09:50:48.321542 master-0 kubenswrapper[15202]: I0319 09:50:48.321459 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 09:50:48.321811 master-0 kubenswrapper[15202]: I0319 09:50:48.321780 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 19 09:50:48.322345 master-0 kubenswrapper[15202]: I0319 09:50:48.322313 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 09:50:48.377385 master-0 kubenswrapper[15202]: I0319 09:50:48.376655 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xl426"]
Mar 19 09:50:48.609786 master-0 kubenswrapper[15202]: I0319 09:50:48.609236 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:48.609786 master-0 kubenswrapper[15202]: I0319 09:50:48.609283 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:50:48.753445 master-0 kubenswrapper[15202]: I0319 09:50:48.753309 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-combined-ca-bundle\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.753445 master-0 kubenswrapper[15202]: I0319 09:50:48.753388 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-credential-keys\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.753445 master-0 kubenswrapper[15202]: I0319 09:50:48.753446 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-scripts\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.753778 master-0 kubenswrapper[15202]: I0319 09:50:48.753641 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-fernet-keys\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.753878 master-0 kubenswrapper[15202]: I0319 09:50:48.753851 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-config-data\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.753951 master-0 kubenswrapper[15202]: I0319 09:50:48.753927 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkl2b\" (UniqueName: \"kubernetes.io/projected/d8782ab3-387f-49a1-94ae-46ba9f1e4241-kube-api-access-lkl2b\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.828214 master-0 kubenswrapper[15202]: I0319 09:50:48.828149 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d687cdb-7807-4089-8c5b-a840cfa6531e" path="/var/lib/kubelet/pods/3d687cdb-7807-4089-8c5b-a840cfa6531e/volumes"
Mar 19 09:50:48.857959 master-0 kubenswrapper[15202]: I0319 09:50:48.857681 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-combined-ca-bundle\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.857959 master-0 kubenswrapper[15202]: I0319 09:50:48.857746 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-credential-keys\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.858898 master-0 kubenswrapper[15202]: I0319 09:50:48.858095 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-scripts\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.858898 master-0 kubenswrapper[15202]: I0319 09:50:48.858256 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-fernet-keys\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:50:48.858898 master-0 kubenswrapper[15202]: I0319 09:50:48.858393 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-config-data\") pod \"keystone-bootstrap-xl426\" (UID:
\"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:48.858898 master-0 kubenswrapper[15202]: I0319 09:50:48.858445 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkl2b\" (UniqueName: \"kubernetes.io/projected/d8782ab3-387f-49a1-94ae-46ba9f1e4241-kube-api-access-lkl2b\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:48.873218 master-0 kubenswrapper[15202]: I0319 09:50:48.873140 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-config-data\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:48.874164 master-0 kubenswrapper[15202]: I0319 09:50:48.873910 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-fernet-keys\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:48.874164 master-0 kubenswrapper[15202]: I0319 09:50:48.873915 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-scripts\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:48.874164 master-0 kubenswrapper[15202]: I0319 09:50:48.873965 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-credential-keys\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " 
pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:48.875141 master-0 kubenswrapper[15202]: I0319 09:50:48.873099 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-combined-ca-bundle\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:49.037431 master-0 kubenswrapper[15202]: I0319 09:50:49.037260 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkl2b\" (UniqueName: \"kubernetes.io/projected/d8782ab3-387f-49a1-94ae-46ba9f1e4241-kube-api-access-lkl2b\") pod \"keystone-bootstrap-xl426\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") " pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:49.242536 master-0 kubenswrapper[15202]: I0319 09:50:49.242445 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xl426" Mar 19 09:50:49.601257 master-0 kubenswrapper[15202]: I0319 09:50:49.601185 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ddd7f485-2r6bg"] Mar 19 09:50:49.605532 master-0 kubenswrapper[15202]: I0319 09:50:49.605509 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.610647 master-0 kubenswrapper[15202]: I0319 09:50:49.608523 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-b" Mar 19 09:50:49.716129 master-0 kubenswrapper[15202]: I0319 09:50:49.715993 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:49.716129 master-0 kubenswrapper[15202]: I0319 09:50:49.716086 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:49.752000 master-0 kubenswrapper[15202]: I0319 09:50:49.751936 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:49.767830 master-0 kubenswrapper[15202]: I0319 09:50:49.767785 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:49.856556 master-0 kubenswrapper[15202]: I0319 09:50:49.856173 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ddd7f485-2r6bg"] Mar 19 09:50:49.912496 master-0 kubenswrapper[15202]: I0319 09:50:49.912234 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-svc\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.912496 master-0 kubenswrapper[15202]: I0319 09:50:49.912300 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-b\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " 
pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.912496 master-0 kubenswrapper[15202]: I0319 09:50:49.912447 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.912871 master-0 kubenswrapper[15202]: I0319 09:50:49.912561 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.912871 master-0 kubenswrapper[15202]: I0319 09:50:49.912606 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-a\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.912871 master-0 kubenswrapper[15202]: I0319 09:50:49.912650 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.917251 master-0 kubenswrapper[15202]: I0319 09:50:49.913643 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-config\") pod 
\"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:49.917251 master-0 kubenswrapper[15202]: I0319 09:50:49.913714 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9jq\" (UniqueName: \"kubernetes.io/projected/1eac8153-6f6c-45de-a444-df3bfae897d1-kube-api-access-4g9jq\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022137 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-config\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022192 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9jq\" (UniqueName: \"kubernetes.io/projected/1eac8153-6f6c-45de-a444-df3bfae897d1-kube-api-access-4g9jq\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022268 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-svc\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022330 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: 
\"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-b\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022561 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022593 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022651 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-a\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.024885 master-0 kubenswrapper[15202]: I0319 09:50:50.022689 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.025459 master-0 kubenswrapper[15202]: I0319 09:50:50.024938 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-sb\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.044803 master-0 kubenswrapper[15202]: I0319 09:50:50.026025 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-config\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.044803 master-0 kubenswrapper[15202]: I0319 09:50:50.027657 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-swift-storage-0\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.044803 master-0 kubenswrapper[15202]: I0319 09:50:50.028492 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-nb\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.044803 master-0 kubenswrapper[15202]: I0319 09:50:50.028679 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-a\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.044803 master-0 kubenswrapper[15202]: I0319 09:50:50.029369 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-svc\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.044803 master-0 kubenswrapper[15202]: I0319 09:50:50.031832 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-b\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.134553 master-0 kubenswrapper[15202]: I0319 09:50:50.132108 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9jq\" (UniqueName: \"kubernetes.io/projected/1eac8153-6f6c-45de-a444-df3bfae897d1-kube-api-access-4g9jq\") pod \"dnsmasq-dns-6ddd7f485-2r6bg\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") " pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.249878 master-0 kubenswrapper[15202]: I0319 09:50:50.249802 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:50:50.638940 master-0 kubenswrapper[15202]: I0319 09:50:50.638777 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:50.638940 master-0 kubenswrapper[15202]: I0319 09:50:50.638841 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:51.273620 master-0 kubenswrapper[15202]: I0319 09:50:51.273009 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm"] Mar 19 09:50:51.275812 master-0 kubenswrapper[15202]: I0319 09:50:51.275476 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.279956 master-0 kubenswrapper[15202]: I0319 09:50:51.279023 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-b-provisionserver-httpd-config" Mar 19 09:50:51.358547 master-0 kubenswrapper[15202]: I0319 09:50:51.358323 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrlg\" (UniqueName: \"kubernetes.io/projected/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-kube-api-access-gqrlg\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.358547 master-0 kubenswrapper[15202]: I0319 09:50:51.358396 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-httpd-config\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.358547 master-0 kubenswrapper[15202]: I0319 09:50:51.358495 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-image-data\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.460930 master-0 kubenswrapper[15202]: I0319 09:50:51.460870 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/configmap/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-httpd-config\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.461168 master-0 kubenswrapper[15202]: I0319 09:50:51.460973 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-image-data\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.461168 master-0 kubenswrapper[15202]: I0319 09:50:51.461110 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrlg\" (UniqueName: \"kubernetes.io/projected/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-kube-api-access-gqrlg\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.461426 master-0 kubenswrapper[15202]: I0319 09:50:51.461396 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-image-data\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.461897 master-0 kubenswrapper[15202]: I0319 09:50:51.461874 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-httpd-config\") pod 
\"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.476637 master-0 kubenswrapper[15202]: I0319 09:50:51.476593 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrlg\" (UniqueName: \"kubernetes.io/projected/e2d72e05-7738-45d4-8b7a-2bfdb439e7f5-kube-api-access-gqrlg\") pod \"edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm\" (UID: \"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:51.609990 master-0 kubenswrapper[15202]: I0319 09:50:51.609015 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:50:52.619769 master-0 kubenswrapper[15202]: I0319 09:50:52.619715 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:52.620311 master-0 kubenswrapper[15202]: I0319 09:50:52.619818 15202 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:50:52.637619 master-0 kubenswrapper[15202]: I0319 09:50:52.637155 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:50:52.693610 master-0 kubenswrapper[15202]: I0319 09:50:52.693515 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.202:5353: connect: connection refused" Mar 19 09:50:53.697298 master-0 kubenswrapper[15202]: I0319 09:50:53.697241 15202 generic.go:334] "Generic (PLEG): container finished" podID="5c700b42-1e60-4ea7-9837-c7474f999c0b" 
containerID="a44cdcbb716944e9e26b3e2360e1a17368c0d236d50cf6c35c4f60a21d2a6ba0" exitCode=0 Mar 19 09:50:53.698172 master-0 kubenswrapper[15202]: I0319 09:50:53.697319 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" event={"ID":"5c700b42-1e60-4ea7-9837-c7474f999c0b","Type":"ContainerDied","Data":"a44cdcbb716944e9e26b3e2360e1a17368c0d236d50cf6c35c4f60a21d2a6ba0"} Mar 19 09:50:53.699999 master-0 kubenswrapper[15202]: I0319 09:50:53.699959 15202 generic.go:334] "Generic (PLEG): container finished" podID="85ab6d34-c24f-4e22-ac73-939b5a791240" containerID="3efb915151e6ca3740a776363e7453d24670d0b9cd300ddafeee1f941012cc5f" exitCode=0 Mar 19 09:50:53.700078 master-0 kubenswrapper[15202]: I0319 09:50:53.700008 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2flmr" event={"ID":"85ab6d34-c24f-4e22-ac73-939b5a791240","Type":"ContainerDied","Data":"3efb915151e6ca3740a776363e7453d24670d0b9cd300ddafeee1f941012cc5f"} Mar 19 09:50:54.062620 master-0 kubenswrapper[15202]: I0319 09:50:54.062085 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:54.063061 master-0 kubenswrapper[15202]: I0319 09:50:54.062670 15202 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:50:54.091689 master-0 kubenswrapper[15202]: I0319 09:50:54.091641 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:50:57.721992 master-0 kubenswrapper[15202]: I0319 09:50:57.721901 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.202:5353: connect: connection refused" Mar 19 09:51:01.088266 master-0 kubenswrapper[15202]: I0319 09:51:01.088208 15202 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2flmr" Mar 19 09:51:01.162329 master-0 kubenswrapper[15202]: I0319 09:51:01.161457 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ab6d34-c24f-4e22-ac73-939b5a791240-logs\") pod \"85ab6d34-c24f-4e22-ac73-939b5a791240\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " Mar 19 09:51:01.162329 master-0 kubenswrapper[15202]: I0319 09:51:01.161520 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-combined-ca-bundle\") pod \"85ab6d34-c24f-4e22-ac73-939b5a791240\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " Mar 19 09:51:01.162329 master-0 kubenswrapper[15202]: I0319 09:51:01.161741 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-scripts\") pod \"85ab6d34-c24f-4e22-ac73-939b5a791240\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " Mar 19 09:51:01.162329 master-0 kubenswrapper[15202]: I0319 09:51:01.161762 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-config-data\") pod \"85ab6d34-c24f-4e22-ac73-939b5a791240\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " Mar 19 09:51:01.162329 master-0 kubenswrapper[15202]: I0319 09:51:01.161873 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhdhl\" (UniqueName: \"kubernetes.io/projected/85ab6d34-c24f-4e22-ac73-939b5a791240-kube-api-access-jhdhl\") pod \"85ab6d34-c24f-4e22-ac73-939b5a791240\" (UID: \"85ab6d34-c24f-4e22-ac73-939b5a791240\") " Mar 19 09:51:01.162698 master-0 kubenswrapper[15202]: I0319 
09:51:01.162337 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ab6d34-c24f-4e22-ac73-939b5a791240-logs" (OuterVolumeSpecName: "logs") pod "85ab6d34-c24f-4e22-ac73-939b5a791240" (UID: "85ab6d34-c24f-4e22-ac73-939b5a791240"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:01.166936 master-0 kubenswrapper[15202]: I0319 09:51:01.166862 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ab6d34-c24f-4e22-ac73-939b5a791240-kube-api-access-jhdhl" (OuterVolumeSpecName: "kube-api-access-jhdhl") pod "85ab6d34-c24f-4e22-ac73-939b5a791240" (UID: "85ab6d34-c24f-4e22-ac73-939b5a791240"). InnerVolumeSpecName "kube-api-access-jhdhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:01.168142 master-0 kubenswrapper[15202]: I0319 09:51:01.167988 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-scripts" (OuterVolumeSpecName: "scripts") pod "85ab6d34-c24f-4e22-ac73-939b5a791240" (UID: "85ab6d34-c24f-4e22-ac73-939b5a791240"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:01.196885 master-0 kubenswrapper[15202]: I0319 09:51:01.195884 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85ab6d34-c24f-4e22-ac73-939b5a791240" (UID: "85ab6d34-c24f-4e22-ac73-939b5a791240"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:01.196885 master-0 kubenswrapper[15202]: I0319 09:51:01.196252 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-config-data" (OuterVolumeSpecName: "config-data") pod "85ab6d34-c24f-4e22-ac73-939b5a791240" (UID: "85ab6d34-c24f-4e22-ac73-939b5a791240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:01.275100 master-0 kubenswrapper[15202]: I0319 09:51:01.274923 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhdhl\" (UniqueName: \"kubernetes.io/projected/85ab6d34-c24f-4e22-ac73-939b5a791240-kube-api-access-jhdhl\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.275100 master-0 kubenswrapper[15202]: I0319 09:51:01.274965 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ab6d34-c24f-4e22-ac73-939b5a791240-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.275100 master-0 kubenswrapper[15202]: I0319 09:51:01.274979 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.275100 master-0 kubenswrapper[15202]: I0319 09:51:01.274994 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.275100 master-0 kubenswrapper[15202]: I0319 09:51:01.275005 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ab6d34-c24f-4e22-ac73-939b5a791240-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.281758 master-0 kubenswrapper[15202]: I0319 09:51:01.281716 15202 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:51:01.378380 master-0 kubenswrapper[15202]: I0319 09:51:01.377503 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-sb\") pod \"5c700b42-1e60-4ea7-9837-c7474f999c0b\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " Mar 19 09:51:01.378380 master-0 kubenswrapper[15202]: I0319 09:51:01.377689 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clvbn\" (UniqueName: \"kubernetes.io/projected/5c700b42-1e60-4ea7-9837-c7474f999c0b-kube-api-access-clvbn\") pod \"5c700b42-1e60-4ea7-9837-c7474f999c0b\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " Mar 19 09:51:01.378380 master-0 kubenswrapper[15202]: I0319 09:51:01.377733 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-config\") pod \"5c700b42-1e60-4ea7-9837-c7474f999c0b\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " Mar 19 09:51:01.378380 master-0 kubenswrapper[15202]: I0319 09:51:01.377790 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-svc\") pod \"5c700b42-1e60-4ea7-9837-c7474f999c0b\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " Mar 19 09:51:01.378380 master-0 kubenswrapper[15202]: I0319 09:51:01.377820 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-nb\") pod \"5c700b42-1e60-4ea7-9837-c7474f999c0b\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " Mar 19 09:51:01.378380 master-0 kubenswrapper[15202]: I0319 
09:51:01.377875 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-swift-storage-0\") pod \"5c700b42-1e60-4ea7-9837-c7474f999c0b\" (UID: \"5c700b42-1e60-4ea7-9837-c7474f999c0b\") " Mar 19 09:51:01.390593 master-0 kubenswrapper[15202]: I0319 09:51:01.390509 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c700b42-1e60-4ea7-9837-c7474f999c0b-kube-api-access-clvbn" (OuterVolumeSpecName: "kube-api-access-clvbn") pod "5c700b42-1e60-4ea7-9837-c7474f999c0b" (UID: "5c700b42-1e60-4ea7-9837-c7474f999c0b"). InnerVolumeSpecName "kube-api-access-clvbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:01.406428 master-0 kubenswrapper[15202]: I0319 09:51:01.406365 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ddd7f485-2r6bg"] Mar 19 09:51:01.436210 master-0 kubenswrapper[15202]: I0319 09:51:01.436150 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c700b42-1e60-4ea7-9837-c7474f999c0b" (UID: "5c700b42-1e60-4ea7-9837-c7474f999c0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:01.457777 master-0 kubenswrapper[15202]: I0319 09:51:01.456693 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c700b42-1e60-4ea7-9837-c7474f999c0b" (UID: "5c700b42-1e60-4ea7-9837-c7474f999c0b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:01.462603 master-0 kubenswrapper[15202]: I0319 09:51:01.462292 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c700b42-1e60-4ea7-9837-c7474f999c0b" (UID: "5c700b42-1e60-4ea7-9837-c7474f999c0b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:01.473572 master-0 kubenswrapper[15202]: I0319 09:51:01.473508 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5c700b42-1e60-4ea7-9837-c7474f999c0b" (UID: "5c700b42-1e60-4ea7-9837-c7474f999c0b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:01.477239 master-0 kubenswrapper[15202]: I0319 09:51:01.477116 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-config" (OuterVolumeSpecName: "config") pod "5c700b42-1e60-4ea7-9837-c7474f999c0b" (UID: "5c700b42-1e60-4ea7-9837-c7474f999c0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:01.483256 master-0 kubenswrapper[15202]: I0319 09:51:01.483190 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.483256 master-0 kubenswrapper[15202]: I0319 09:51:01.483247 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clvbn\" (UniqueName: \"kubernetes.io/projected/5c700b42-1e60-4ea7-9837-c7474f999c0b-kube-api-access-clvbn\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.483504 master-0 kubenswrapper[15202]: I0319 09:51:01.483269 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.483504 master-0 kubenswrapper[15202]: I0319 09:51:01.483282 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.483504 master-0 kubenswrapper[15202]: I0319 09:51:01.483294 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.483504 master-0 kubenswrapper[15202]: I0319 09:51:01.483309 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c700b42-1e60-4ea7-9837-c7474f999c0b-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:01.525060 master-0 kubenswrapper[15202]: I0319 09:51:01.524998 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xl426"] Mar 19 09:51:01.825335 master-0 kubenswrapper[15202]: 
I0319 09:51:01.825272 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" event={"ID":"5c700b42-1e60-4ea7-9837-c7474f999c0b","Type":"ContainerDied","Data":"faa3f4e96a853131332c6e18bec61e1381293ac570d597ce371817af8cc2477f"} Mar 19 09:51:01.825583 master-0 kubenswrapper[15202]: I0319 09:51:01.825360 15202 scope.go:117] "RemoveContainer" containerID="a44cdcbb716944e9e26b3e2360e1a17368c0d236d50cf6c35c4f60a21d2a6ba0" Mar 19 09:51:01.825583 master-0 kubenswrapper[15202]: I0319 09:51:01.825301 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9748bd58f-s2fbq" Mar 19 09:51:01.827274 master-0 kubenswrapper[15202]: I0319 09:51:01.827085 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2flmr" Mar 19 09:51:01.827433 master-0 kubenswrapper[15202]: I0319 09:51:01.827373 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2flmr" event={"ID":"85ab6d34-c24f-4e22-ac73-939b5a791240","Type":"ContainerDied","Data":"d5d6734f49e45387f341c2c0bce19efa0917a0acbedddc64e04117c4628873c5"} Mar 19 09:51:01.827433 master-0 kubenswrapper[15202]: I0319 09:51:01.827412 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d6734f49e45387f341c2c0bce19efa0917a0acbedddc64e04117c4628873c5" Mar 19 09:51:01.843531 master-0 kubenswrapper[15202]: I0319 09:51:01.842716 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl426" event={"ID":"d8782ab3-387f-49a1-94ae-46ba9f1e4241","Type":"ContainerStarted","Data":"c38e638a8d646d2bdcd40ecea57b0e7de5a5b1776624134656b4f20ac203abd8"} Mar 19 09:51:01.843531 master-0 kubenswrapper[15202]: I0319 09:51:01.842757 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl426" 
event={"ID":"d8782ab3-387f-49a1-94ae-46ba9f1e4241","Type":"ContainerStarted","Data":"367518c1b6f7a035df5ad9c5970861b2a05cdabec40758b7aebb82def9cd89b6"} Mar 19 09:51:01.847783 master-0 kubenswrapper[15202]: I0319 09:51:01.847744 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" event={"ID":"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5","Type":"ContainerStarted","Data":"23a1cd11ba281aa2b2dc93ef6df7c1029555c6b40227f7a0c7f4b7723ac8d673"} Mar 19 09:51:01.847872 master-0 kubenswrapper[15202]: I0319 09:51:01.847825 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" event={"ID":"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5","Type":"ContainerStarted","Data":"00eaf6b33cf2738c89be784b287d3e5ca7dff3912a1a67beeaf0e418e1448fd1"} Mar 19 09:51:01.850978 master-0 kubenswrapper[15202]: I0319 09:51:01.850918 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" event={"ID":"9bf992bb-2aac-49c3-8135-6ab9f3a53193","Type":"ContainerStarted","Data":"94cd7910796db39c83a535c6f328d95df6627ee219c3c6f2621109460155f2a3"} Mar 19 09:51:01.852803 master-0 kubenswrapper[15202]: I0319 09:51:01.852584 15202 generic.go:334] "Generic (PLEG): container finished" podID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerID="61b2b3a855256cabafbc5c00692e31b7226804281bd2d9d3e0c960784e0d8c32" exitCode=0 Mar 19 09:51:01.852803 master-0 kubenswrapper[15202]: I0319 09:51:01.852665 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" event={"ID":"1eac8153-6f6c-45de-a444-df3bfae897d1","Type":"ContainerDied","Data":"61b2b3a855256cabafbc5c00692e31b7226804281bd2d9d3e0c960784e0d8c32"} Mar 19 09:51:01.852803 master-0 kubenswrapper[15202]: I0319 09:51:01.852686 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" event={"ID":"1eac8153-6f6c-45de-a444-df3bfae897d1","Type":"ContainerStarted","Data":"be37d8cfe535a81320a9735a22a2969ad787c4f4ebea62c4e6be3c61acc0bf9e"} Mar 19 09:51:01.895536 master-0 kubenswrapper[15202]: I0319 09:51:01.881922 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xl426" podStartSLOduration=14.881899867 podStartE2EDuration="14.881899867s" podCreationTimestamp="2026-03-19 09:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:01.862710984 +0000 UTC m=+1579.248125820" watchObservedRunningTime="2026-03-19 09:51:01.881899867 +0000 UTC m=+1579.267314683" Mar 19 09:51:02.083423 master-0 kubenswrapper[15202]: I0319 09:51:02.083352 15202 scope.go:117] "RemoveContainer" containerID="e81aa67c41b386c44d8f394d64776510502d57467e8fc1b7480f4aff25c1565f" Mar 19 09:51:02.409866 master-0 kubenswrapper[15202]: I0319 09:51:02.409635 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9748bd58f-s2fbq"] Mar 19 09:51:02.880806 master-0 kubenswrapper[15202]: I0319 09:51:02.880670 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-db-sync-jdc2m" event={"ID":"04e14c4d-4d08-4c3c-8803-d39b03125169","Type":"ContainerStarted","Data":"42ca44b5febf4f3dca1cd9cb5cb5b30c9ac65c7ccd22ac6082676dc8fbf27bd5"} Mar 19 09:51:02.884293 master-0 kubenswrapper[15202]: I0319 09:51:02.884233 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" event={"ID":"1eac8153-6f6c-45de-a444-df3bfae897d1","Type":"ContainerStarted","Data":"ccec4f13f40436d35baae2a4233753999f750ce51c44ebbc28ce4795ce325986"} Mar 19 09:51:03.094777 master-0 kubenswrapper[15202]: I0319 09:51:03.094706 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9748bd58f-s2fbq"] Mar 19 
09:51:03.897728 master-0 kubenswrapper[15202]: I0319 09:51:03.897652 15202 generic.go:334] "Generic (PLEG): container finished" podID="972a5655-2953-4875-b9cd-2b5481c6ff30" containerID="62d661213ddc7402b16f43961ab0ddd88189e01cab778f58d3c4309a43d8ab8e" exitCode=0 Mar 19 09:51:03.898857 master-0 kubenswrapper[15202]: I0319 09:51:03.898830 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hbzpf" event={"ID":"972a5655-2953-4875-b9cd-2b5481c6ff30","Type":"ContainerDied","Data":"62d661213ddc7402b16f43961ab0ddd88189e01cab778f58d3c4309a43d8ab8e"} Mar 19 09:51:03.899455 master-0 kubenswrapper[15202]: I0319 09:51:03.899430 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:51:04.346122 master-0 kubenswrapper[15202]: I0319 09:51:04.346028 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" podStartSLOduration=15.346001966 podStartE2EDuration="15.346001966s" podCreationTimestamp="2026-03-19 09:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:04.315393711 +0000 UTC m=+1581.700808527" watchObservedRunningTime="2026-03-19 09:51:04.346001966 +0000 UTC m=+1581.731416792" Mar 19 09:51:04.837245 master-0 kubenswrapper[15202]: I0319 09:51:04.837170 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" path="/var/lib/kubelet/pods/5c700b42-1e60-4ea7-9837-c7474f999c0b/volumes" Mar 19 09:51:04.848725 master-0 kubenswrapper[15202]: I0319 09:51:04.848653 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-687479ff9d-8shw8"] Mar 19 09:51:04.849344 master-0 kubenswrapper[15202]: E0319 09:51:04.849316 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ab6d34-c24f-4e22-ac73-939b5a791240" 
containerName="placement-db-sync" Mar 19 09:51:04.849344 master-0 kubenswrapper[15202]: I0319 09:51:04.849341 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ab6d34-c24f-4e22-ac73-939b5a791240" containerName="placement-db-sync" Mar 19 09:51:04.849447 master-0 kubenswrapper[15202]: E0319 09:51:04.849373 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="dnsmasq-dns" Mar 19 09:51:04.849447 master-0 kubenswrapper[15202]: I0319 09:51:04.849380 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="dnsmasq-dns" Mar 19 09:51:04.849447 master-0 kubenswrapper[15202]: E0319 09:51:04.849406 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="init" Mar 19 09:51:04.849447 master-0 kubenswrapper[15202]: I0319 09:51:04.849413 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="init" Mar 19 09:51:04.849678 master-0 kubenswrapper[15202]: I0319 09:51:04.849654 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c700b42-1e60-4ea7-9837-c7474f999c0b" containerName="dnsmasq-dns" Mar 19 09:51:04.849732 master-0 kubenswrapper[15202]: I0319 09:51:04.849677 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ab6d34-c24f-4e22-ac73-939b5a791240" containerName="placement-db-sync" Mar 19 09:51:04.850874 master-0 kubenswrapper[15202]: I0319 09:51:04.850847 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.855820 master-0 kubenswrapper[15202]: I0319 09:51:04.855731 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-db-sync-jdc2m" podStartSLOduration=10.185065807 podStartE2EDuration="35.85571095s" podCreationTimestamp="2026-03-19 09:50:29 +0000 UTC" firstStartedPulling="2026-03-19 09:50:35.329319925 +0000 UTC m=+1552.714734741" lastFinishedPulling="2026-03-19 09:51:00.999965068 +0000 UTC m=+1578.385379884" observedRunningTime="2026-03-19 09:51:04.830134649 +0000 UTC m=+1582.215549465" watchObservedRunningTime="2026-03-19 09:51:04.85571095 +0000 UTC m=+1582.241125766" Mar 19 09:51:04.856878 master-0 kubenswrapper[15202]: I0319 09:51:04.856569 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 09:51:04.860530 master-0 kubenswrapper[15202]: I0319 09:51:04.857154 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 19 09:51:04.860530 master-0 kubenswrapper[15202]: I0319 09:51:04.857326 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 19 09:51:04.860530 master-0 kubenswrapper[15202]: I0319 09:51:04.857378 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 09:51:04.878212 master-0 kubenswrapper[15202]: I0319 09:51:04.878137 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-scripts\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.878212 master-0 kubenswrapper[15202]: I0319 09:51:04.878203 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-public-tls-certs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.878747 master-0 kubenswrapper[15202]: I0319 09:51:04.878257 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg7jk\" (UniqueName: \"kubernetes.io/projected/fa213423-98fc-446d-9208-33d884780995-kube-api-access-mg7jk\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.878747 master-0 kubenswrapper[15202]: I0319 09:51:04.878333 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa213423-98fc-446d-9208-33d884780995-logs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.878747 master-0 kubenswrapper[15202]: I0319 09:51:04.878385 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-combined-ca-bundle\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.878747 master-0 kubenswrapper[15202]: I0319 09:51:04.878500 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-internal-tls-certs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.878747 master-0 kubenswrapper[15202]: I0319 
09:51:04.878570 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-config-data\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980024 master-0 kubenswrapper[15202]: I0319 09:51:04.979968 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-scripts\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980024 master-0 kubenswrapper[15202]: I0319 09:51:04.980030 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-public-tls-certs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980687 master-0 kubenswrapper[15202]: I0319 09:51:04.980085 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg7jk\" (UniqueName: \"kubernetes.io/projected/fa213423-98fc-446d-9208-33d884780995-kube-api-access-mg7jk\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980687 master-0 kubenswrapper[15202]: I0319 09:51:04.980126 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa213423-98fc-446d-9208-33d884780995-logs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980687 master-0 kubenswrapper[15202]: I0319 09:51:04.980144 
15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-combined-ca-bundle\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980687 master-0 kubenswrapper[15202]: I0319 09:51:04.980199 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-internal-tls-certs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.980687 master-0 kubenswrapper[15202]: I0319 09:51:04.980292 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-config-data\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.982755 master-0 kubenswrapper[15202]: I0319 09:51:04.982717 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa213423-98fc-446d-9208-33d884780995-logs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.985726 master-0 kubenswrapper[15202]: I0319 09:51:04.985652 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-scripts\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.985991 master-0 kubenswrapper[15202]: I0319 09:51:04.985930 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-public-tls-certs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.987953 master-0 kubenswrapper[15202]: I0319 09:51:04.987923 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-internal-tls-certs\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.991661 master-0 kubenswrapper[15202]: I0319 09:51:04.991631 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-config-data\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:04.992065 master-0 kubenswrapper[15202]: I0319 09:51:04.991998 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-combined-ca-bundle\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:05.399988 master-0 kubenswrapper[15202]: I0319 09:51:05.399911 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hbzpf" Mar 19 09:51:05.488649 master-0 kubenswrapper[15202]: I0319 09:51:05.488566 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvpg\" (UniqueName: \"kubernetes.io/projected/972a5655-2953-4875-b9cd-2b5481c6ff30-kube-api-access-nfvpg\") pod \"972a5655-2953-4875-b9cd-2b5481c6ff30\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " Mar 19 09:51:05.488885 master-0 kubenswrapper[15202]: I0319 09:51:05.488740 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-config\") pod \"972a5655-2953-4875-b9cd-2b5481c6ff30\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " Mar 19 09:51:05.492909 master-0 kubenswrapper[15202]: I0319 09:51:05.492815 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972a5655-2953-4875-b9cd-2b5481c6ff30-kube-api-access-nfvpg" (OuterVolumeSpecName: "kube-api-access-nfvpg") pod "972a5655-2953-4875-b9cd-2b5481c6ff30" (UID: "972a5655-2953-4875-b9cd-2b5481c6ff30"). InnerVolumeSpecName "kube-api-access-nfvpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:05.514072 master-0 kubenswrapper[15202]: I0319 09:51:05.513995 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-config" (OuterVolumeSpecName: "config") pod "972a5655-2953-4875-b9cd-2b5481c6ff30" (UID: "972a5655-2953-4875-b9cd-2b5481c6ff30"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:05.590257 master-0 kubenswrapper[15202]: I0319 09:51:05.590009 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-combined-ca-bundle\") pod \"972a5655-2953-4875-b9cd-2b5481c6ff30\" (UID: \"972a5655-2953-4875-b9cd-2b5481c6ff30\") " Mar 19 09:51:05.590849 master-0 kubenswrapper[15202]: I0319 09:51:05.590803 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:05.590849 master-0 kubenswrapper[15202]: I0319 09:51:05.590827 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfvpg\" (UniqueName: \"kubernetes.io/projected/972a5655-2953-4875-b9cd-2b5481c6ff30-kube-api-access-nfvpg\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:05.615873 master-0 kubenswrapper[15202]: I0319 09:51:05.615796 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "972a5655-2953-4875-b9cd-2b5481c6ff30" (UID: "972a5655-2953-4875-b9cd-2b5481c6ff30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:05.693442 master-0 kubenswrapper[15202]: I0319 09:51:05.693317 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a5655-2953-4875-b9cd-2b5481c6ff30-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:05.942739 master-0 kubenswrapper[15202]: I0319 09:51:05.942673 15202 generic.go:334] "Generic (PLEG): container finished" podID="d8782ab3-387f-49a1-94ae-46ba9f1e4241" containerID="c38e638a8d646d2bdcd40ecea57b0e7de5a5b1776624134656b4f20ac203abd8" exitCode=0
Mar 19 09:51:05.943037 master-0 kubenswrapper[15202]: I0319 09:51:05.942779 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl426" event={"ID":"d8782ab3-387f-49a1-94ae-46ba9f1e4241","Type":"ContainerDied","Data":"c38e638a8d646d2bdcd40ecea57b0e7de5a5b1776624134656b4f20ac203abd8"}
Mar 19 09:51:05.945830 master-0 kubenswrapper[15202]: I0319 09:51:05.945762 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hbzpf" event={"ID":"972a5655-2953-4875-b9cd-2b5481c6ff30","Type":"ContainerDied","Data":"a0cfb7f46baa0485fd82421ca1b67a699d750cf13a02bc795d246030b201122e"}
Mar 19 09:51:05.945946 master-0 kubenswrapper[15202]: I0319 09:51:05.945930 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0cfb7f46baa0485fd82421ca1b67a699d750cf13a02bc795d246030b201122e"
Mar 19 09:51:05.946180 master-0 kubenswrapper[15202]: I0319 09:51:05.946163 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hbzpf"
Mar 19 09:51:06.186867 master-0 kubenswrapper[15202]: I0319 09:51:06.185409 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687479ff9d-8shw8"]
Mar 19 09:51:06.965135 master-0 kubenswrapper[15202]: I0319 09:51:06.965051 15202 generic.go:334] "Generic (PLEG): container finished" podID="9bf992bb-2aac-49c3-8135-6ab9f3a53193" containerID="94cd7910796db39c83a535c6f328d95df6627ee219c3c6f2621109460155f2a3" exitCode=0
Mar 19 09:51:06.965541 master-0 kubenswrapper[15202]: I0319 09:51:06.965144 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" event={"ID":"9bf992bb-2aac-49c3-8135-6ab9f3a53193","Type":"ContainerDied","Data":"94cd7910796db39c83a535c6f328d95df6627ee219c3c6f2621109460155f2a3"}
Mar 19 09:51:07.337388 master-0 kubenswrapper[15202]: I0319 09:51:07.337185 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg7jk\" (UniqueName: \"kubernetes.io/projected/fa213423-98fc-446d-9208-33d884780995-kube-api-access-mg7jk\") pod \"placement-687479ff9d-8shw8\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " pod="openstack/placement-687479ff9d-8shw8"
Mar 19 09:51:07.432414 master-0 kubenswrapper[15202]: I0319 09:51:07.432349 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:51:07.473711 master-0 kubenswrapper[15202]: I0319 09:51:07.473622 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-combined-ca-bundle\") pod \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") "
Mar 19 09:51:07.474022 master-0 kubenswrapper[15202]: I0319 09:51:07.473747 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkl2b\" (UniqueName: \"kubernetes.io/projected/d8782ab3-387f-49a1-94ae-46ba9f1e4241-kube-api-access-lkl2b\") pod \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") "
Mar 19 09:51:07.474022 master-0 kubenswrapper[15202]: I0319 09:51:07.473818 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-scripts\") pod \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") "
Mar 19 09:51:07.474022 master-0 kubenswrapper[15202]: I0319 09:51:07.473931 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-fernet-keys\") pod \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") "
Mar 19 09:51:07.474022 master-0 kubenswrapper[15202]: I0319 09:51:07.474003 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-config-data\") pod \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") "
Mar 19 09:51:07.474241 master-0 kubenswrapper[15202]: I0319 09:51:07.474068 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-credential-keys\") pod \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\" (UID: \"d8782ab3-387f-49a1-94ae-46ba9f1e4241\") "
Mar 19 09:51:07.478321 master-0 kubenswrapper[15202]: I0319 09:51:07.478268 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8782ab3-387f-49a1-94ae-46ba9f1e4241-kube-api-access-lkl2b" (OuterVolumeSpecName: "kube-api-access-lkl2b") pod "d8782ab3-387f-49a1-94ae-46ba9f1e4241" (UID: "d8782ab3-387f-49a1-94ae-46ba9f1e4241"). InnerVolumeSpecName "kube-api-access-lkl2b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:51:07.478609 master-0 kubenswrapper[15202]: I0319 09:51:07.478460 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d8782ab3-387f-49a1-94ae-46ba9f1e4241" (UID: "d8782ab3-387f-49a1-94ae-46ba9f1e4241"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:07.480797 master-0 kubenswrapper[15202]: I0319 09:51:07.480688 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d8782ab3-387f-49a1-94ae-46ba9f1e4241" (UID: "d8782ab3-387f-49a1-94ae-46ba9f1e4241"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:07.480994 master-0 kubenswrapper[15202]: I0319 09:51:07.480907 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-scripts" (OuterVolumeSpecName: "scripts") pod "d8782ab3-387f-49a1-94ae-46ba9f1e4241" (UID: "d8782ab3-387f-49a1-94ae-46ba9f1e4241"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:07.501264 master-0 kubenswrapper[15202]: I0319 09:51:07.501181 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8782ab3-387f-49a1-94ae-46ba9f1e4241" (UID: "d8782ab3-387f-49a1-94ae-46ba9f1e4241"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:07.503411 master-0 kubenswrapper[15202]: I0319 09:51:07.503293 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-config-data" (OuterVolumeSpecName: "config-data") pod "d8782ab3-387f-49a1-94ae-46ba9f1e4241" (UID: "d8782ab3-387f-49a1-94ae-46ba9f1e4241"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:07.571176 master-0 kubenswrapper[15202]: I0319 09:51:07.571108 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-687479ff9d-8shw8"
Mar 19 09:51:07.576157 master-0 kubenswrapper[15202]: I0319 09:51:07.576071 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:07.576157 master-0 kubenswrapper[15202]: I0319 09:51:07.576136 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkl2b\" (UniqueName: \"kubernetes.io/projected/d8782ab3-387f-49a1-94ae-46ba9f1e4241-kube-api-access-lkl2b\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:07.576157 master-0 kubenswrapper[15202]: I0319 09:51:07.576147 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:07.576157 master-0 kubenswrapper[15202]: I0319 09:51:07.576157 15202 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:07.576157 master-0 kubenswrapper[15202]: I0319 09:51:07.576166 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:07.576505 master-0 kubenswrapper[15202]: I0319 09:51:07.576175 15202 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8782ab3-387f-49a1-94ae-46ba9f1e4241-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:07.983709 master-0 kubenswrapper[15202]: I0319 09:51:07.983567 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xl426" event={"ID":"d8782ab3-387f-49a1-94ae-46ba9f1e4241","Type":"ContainerDied","Data":"367518c1b6f7a035df5ad9c5970861b2a05cdabec40758b7aebb82def9cd89b6"}
Mar 19 09:51:07.983709 master-0 kubenswrapper[15202]: I0319 09:51:07.983618 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xl426"
Mar 19 09:51:07.983709 master-0 kubenswrapper[15202]: I0319 09:51:07.983636 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="367518c1b6f7a035df5ad9c5970861b2a05cdabec40758b7aebb82def9cd89b6"
Mar 19 09:51:07.986444 master-0 kubenswrapper[15202]: I0319 09:51:07.986390 15202 generic.go:334] "Generic (PLEG): container finished" podID="e2d72e05-7738-45d4-8b7a-2bfdb439e7f5" containerID="23a1cd11ba281aa2b2dc93ef6df7c1029555c6b40227f7a0c7f4b7723ac8d673" exitCode=0
Mar 19 09:51:07.986553 master-0 kubenswrapper[15202]: I0319 09:51:07.986442 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" event={"ID":"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5","Type":"ContainerDied","Data":"23a1cd11ba281aa2b2dc93ef6df7c1029555c6b40227f7a0c7f4b7723ac8d673"}
Mar 19 09:51:08.937255 master-0 kubenswrapper[15202]: I0319 09:51:08.937191 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687479ff9d-8shw8"]
Mar 19 09:51:08.997770 master-0 kubenswrapper[15202]: I0319 09:51:08.997517 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687479ff9d-8shw8" event={"ID":"fa213423-98fc-446d-9208-33d884780995","Type":"ContainerStarted","Data":"c856abfd7d7575ad1d1bd188cd9d13814b43bf73d84a00d443de688c4ef458ff"}
Mar 19 09:51:09.757030 master-0 kubenswrapper[15202]: I0319 09:51:09.756705 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ddd7f485-2r6bg"]
Mar 19 09:51:09.757338 master-0 kubenswrapper[15202]: I0319 09:51:09.757047 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="dnsmasq-dns" containerID="cri-o://ccec4f13f40436d35baae2a4233753999f750ce51c44ebbc28ce4795ce325986" gracePeriod=10
Mar 19 09:51:09.766025 master-0 kubenswrapper[15202]: I0319 09:51:09.765971 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg"
Mar 19 09:51:09.939449 master-0 kubenswrapper[15202]: I0319 09:51:09.939303 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849fd5d677-sdj8j"]
Mar 19 09:51:09.947609 master-0 kubenswrapper[15202]: E0319 09:51:09.939982 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8782ab3-387f-49a1-94ae-46ba9f1e4241" containerName="keystone-bootstrap"
Mar 19 09:51:09.947609 master-0 kubenswrapper[15202]: I0319 09:51:09.940001 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8782ab3-387f-49a1-94ae-46ba9f1e4241" containerName="keystone-bootstrap"
Mar 19 09:51:09.947609 master-0 kubenswrapper[15202]: E0319 09:51:09.940029 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972a5655-2953-4875-b9cd-2b5481c6ff30" containerName="neutron-db-sync"
Mar 19 09:51:09.947609 master-0 kubenswrapper[15202]: I0319 09:51:09.940035 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="972a5655-2953-4875-b9cd-2b5481c6ff30" containerName="neutron-db-sync"
Mar 19 09:51:09.955703 master-0 kubenswrapper[15202]: I0319 09:51:09.955645 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="972a5655-2953-4875-b9cd-2b5481c6ff30" containerName="neutron-db-sync"
Mar 19 09:51:09.955925 master-0 kubenswrapper[15202]: I0319 09:51:09.955722 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8782ab3-387f-49a1-94ae-46ba9f1e4241" containerName="keystone-bootstrap"
Mar 19 09:51:09.971871 master-0 kubenswrapper[15202]: I0319 09:51:09.971793 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fd5d677-sdj8j"]
Mar 19 09:51:09.981121 master-0 kubenswrapper[15202]: I0319 09:51:09.981071 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051677 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-b\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051743 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-nb\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051773 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgq8\" (UniqueName: \"kubernetes.io/projected/aa538e33-3e22-45d9-8109-6aaba8e9ee52-kube-api-access-9zgq8\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051810 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-swift-storage-0\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051837 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-a\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051891 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-config\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.051995 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-sb\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.052039 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-svc\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.061492 master-0 kubenswrapper[15202]: I0319 09:51:10.060616 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85f97d8d64-dfwgh"]
Mar 19 09:51:10.078670 master-0 kubenswrapper[15202]: I0319 09:51:10.075964 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.086544 master-0 kubenswrapper[15202]: I0319 09:51:10.086386 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 19 09:51:10.086855 master-0 kubenswrapper[15202]: I0319 09:51:10.086826 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 19 09:51:10.107536 master-0 kubenswrapper[15202]: I0319 09:51:10.102672 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 19 09:51:10.118372 master-0 kubenswrapper[15202]: I0319 09:51:10.116618 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85f97d8d64-dfwgh"]
Mar 19 09:51:10.166545 master-0 kubenswrapper[15202]: I0319 09:51:10.164400 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-combined-ca-bundle\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.176532 master-0 kubenswrapper[15202]: I0319 09:51:10.167664 15202 generic.go:334] "Generic (PLEG): container finished" podID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerID="ccec4f13f40436d35baae2a4233753999f750ce51c44ebbc28ce4795ce325986" exitCode=0
Mar 19 09:51:10.176532 master-0 kubenswrapper[15202]: I0319 09:51:10.167736 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" event={"ID":"1eac8153-6f6c-45de-a444-df3bfae897d1","Type":"ContainerDied","Data":"ccec4f13f40436d35baae2a4233753999f750ce51c44ebbc28ce4795ce325986"}
Mar 19 09:51:10.176532 master-0 kubenswrapper[15202]: I0319 09:51:10.170263 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-sb\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.164500 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-sb\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.187912 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsmv9\" (UniqueName: \"kubernetes.io/projected/bde4d125-5422-48a5-809b-b7326315062c-kube-api-access-gsmv9\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.187988 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-svc\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188040 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-b\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188069 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-nb\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188106 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-ovndb-tls-certs\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188124 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgq8\" (UniqueName: \"kubernetes.io/projected/aa538e33-3e22-45d9-8109-6aaba8e9ee52-kube-api-access-9zgq8\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188218 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-swift-storage-0\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188273 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-a\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188306 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-config\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.189556 master-0 kubenswrapper[15202]: I0319 09:51:10.188422 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-config\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.193746 master-0 kubenswrapper[15202]: I0319 09:51:10.192796 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-httpd-config\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.197718 master-0 kubenswrapper[15202]: I0319 09:51:10.196466 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-nb\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.222437 master-0 kubenswrapper[15202]: I0319 09:51:10.212439 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-config\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.222437 master-0 kubenswrapper[15202]: I0319 09:51:10.213029 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-b\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.222437 master-0 kubenswrapper[15202]: I0319 09:51:10.221246 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-a\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.223001 master-0 kubenswrapper[15202]: I0319 09:51:10.222955 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-swift-storage-0\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.247012 master-0 kubenswrapper[15202]: I0319 09:51:10.241378 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687479ff9d-8shw8" event={"ID":"fa213423-98fc-446d-9208-33d884780995","Type":"ContainerStarted","Data":"9f93692bac9e51d66d621ba05b2a00361d12e9e51fa9284ff9722c9c29dd9a2f"}
Mar 19 09:51:10.247012 master-0 kubenswrapper[15202]: I0319 09:51:10.241434 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687479ff9d-8shw8" event={"ID":"fa213423-98fc-446d-9208-33d884780995","Type":"ContainerStarted","Data":"037976b1a5e8e92d16755532488ffdbebd0e4c908e4d2426cb213e35e9515dcf"}
Mar 19 09:51:10.247012 master-0 kubenswrapper[15202]: I0319 09:51:10.241604 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687479ff9d-8shw8"
Mar 19 09:51:10.247012 master-0 kubenswrapper[15202]: I0319 09:51:10.241654 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687479ff9d-8shw8"
Mar 19 09:51:10.247012 master-0 kubenswrapper[15202]: I0319 09:51:10.245818 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-svc\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.268445 master-0 kubenswrapper[15202]: I0319 09:51:10.254784 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgq8\" (UniqueName: \"kubernetes.io/projected/aa538e33-3e22-45d9-8109-6aaba8e9ee52-kube-api-access-9zgq8\") pod \"dnsmasq-dns-849fd5d677-sdj8j\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") " pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.277489 master-0 kubenswrapper[15202]: I0319 09:51:10.272130 15202 generic.go:334] "Generic (PLEG): container finished" podID="04e14c4d-4d08-4c3c-8803-d39b03125169" containerID="42ca44b5febf4f3dca1cd9cb5cb5b30c9ac65c7ccd22ac6082676dc8fbf27bd5" exitCode=0
Mar 19 09:51:10.277489 master-0 kubenswrapper[15202]: I0319 09:51:10.272183 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-db-sync-jdc2m" event={"ID":"04e14c4d-4d08-4c3c-8803-d39b03125169","Type":"ContainerDied","Data":"42ca44b5febf4f3dca1cd9cb5cb5b30c9ac65c7ccd22ac6082676dc8fbf27bd5"}
Mar 19 09:51:10.308435 master-0 kubenswrapper[15202]: I0319 09:51:10.308003 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-httpd-config\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.308435 master-0 kubenswrapper[15202]: I0319 09:51:10.308134 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-combined-ca-bundle\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.308435 master-0 kubenswrapper[15202]: I0319 09:51:10.308194 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsmv9\" (UniqueName: \"kubernetes.io/projected/bde4d125-5422-48a5-809b-b7326315062c-kube-api-access-gsmv9\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.308435 master-0 kubenswrapper[15202]: I0319 09:51:10.308244 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-ovndb-tls-certs\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.308435 master-0 kubenswrapper[15202]: I0319 09:51:10.308296 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-config\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.326777 master-0 kubenswrapper[15202]: I0319 09:51:10.325687 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-combined-ca-bundle\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.327517 master-0 kubenswrapper[15202]: I0319 09:51:10.327429 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-config\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.360560 master-0 kubenswrapper[15202]: I0319 09:51:10.359685 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-httpd-config\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.372059 master-0 kubenswrapper[15202]: I0319 09:51:10.371950 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-ovndb-tls-certs\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.374068 master-0 kubenswrapper[15202]: I0319 09:51:10.373939 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsmv9\" (UniqueName: \"kubernetes.io/projected/bde4d125-5422-48a5-809b-b7326315062c-kube-api-access-gsmv9\") pod \"neutron-85f97d8d64-dfwgh\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") " pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.407918 master-0 kubenswrapper[15202]: I0319 09:51:10.405013 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-687479ff9d-8shw8" podStartSLOduration=7.404990495 podStartE2EDuration="7.404990495s" podCreationTimestamp="2026-03-19 09:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:10.310396894 +0000 UTC m=+1587.695811710" watchObservedRunningTime="2026-03-19 09:51:10.404990495 +0000 UTC m=+1587.790405311"
Mar 19 09:51:10.427075 master-0 kubenswrapper[15202]: I0319 09:51:10.427011 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:10.524150 master-0 kubenswrapper[15202]: I0319 09:51:10.524091 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:10.656460 master-0 kubenswrapper[15202]: I0319 09:51:10.656299 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg"
Mar 19 09:51:10.673414 master-0 kubenswrapper[15202]: I0319 09:51:10.673289 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b44d66bc9-5zxbb"]
Mar 19 09:51:10.692716 master-0 kubenswrapper[15202]: E0319 09:51:10.692658 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="init"
Mar 19 09:51:10.692716 master-0 kubenswrapper[15202]: I0319 09:51:10.692709 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="init"
Mar 19 09:51:10.692942 master-0 kubenswrapper[15202]: E0319 09:51:10.692805 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="dnsmasq-dns"
Mar 19 09:51:10.692942 master-0 kubenswrapper[15202]: I0319 09:51:10.692815 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="dnsmasq-dns"
Mar 19 09:51:10.693544 master-0 kubenswrapper[15202]: I0319 09:51:10.693524 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="dnsmasq-dns"
Mar 19 09:51:10.697856 master-0 kubenswrapper[15202]: I0319 09:51:10.697825 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b44d66bc9-5zxbb"
Mar 19 09:51:10.702778 master-0 kubenswrapper[15202]: I0319 09:51:10.700328 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 09:51:10.702778 master-0 kubenswrapper[15202]: I0319 09:51:10.701180 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 19 09:51:10.702778 master-0 kubenswrapper[15202]: I0319 09:51:10.701412 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 19 09:51:10.702778 master-0 kubenswrapper[15202]: I0319 09:51:10.702352 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 09:51:10.702778 master-0 kubenswrapper[15202]: I0319 09:51:10.702524 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 09:51:10.720995 master-0 kubenswrapper[15202]: I0319 09:51:10.720342 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-nb\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.721306 master-0 kubenswrapper[15202]: I0319 09:51:10.721281 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-sb\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.721437 master-0 kubenswrapper[15202]: I0319 09:51:10.721400 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9jq\" (UniqueName: \"kubernetes.io/projected/1eac8153-6f6c-45de-a444-df3bfae897d1-kube-api-access-4g9jq\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.721710 master-0 kubenswrapper[15202]: I0319 09:51:10.721670 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-swift-storage-0\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.721949 master-0 kubenswrapper[15202]: I0319 09:51:10.721915 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-b\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.722007 master-0 kubenswrapper[15202]: I0319 09:51:10.721958 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-config\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.722045 master-0 kubenswrapper[15202]: I0319 09:51:10.722024 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-svc\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.722156 master-0 kubenswrapper[15202]: I0319 09:51:10.722067 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-a\") pod \"1eac8153-6f6c-45de-a444-df3bfae897d1\" (UID: \"1eac8153-6f6c-45de-a444-df3bfae897d1\") "
Mar 19 09:51:10.722517 master-0 kubenswrapper[15202]: I0319 09:51:10.722415 15202 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-public-tls-certs\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.722615 master-0 kubenswrapper[15202]: I0319 09:51:10.722564 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-config-data\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.722673 master-0 kubenswrapper[15202]: I0319 09:51:10.722610 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-credential-keys\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.722713 master-0 kubenswrapper[15202]: I0319 09:51:10.722671 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhllx\" (UniqueName: \"kubernetes.io/projected/c79cbfca-37c7-4d97-87a9-6da6333a6302-kube-api-access-dhllx\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.723057 master-0 kubenswrapper[15202]: I0319 09:51:10.722778 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-combined-ca-bundle\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 
09:51:10.723057 master-0 kubenswrapper[15202]: I0319 09:51:10.722826 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-scripts\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.723057 master-0 kubenswrapper[15202]: I0319 09:51:10.722900 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-internal-tls-certs\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.723057 master-0 kubenswrapper[15202]: I0319 09:51:10.722934 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-fernet-keys\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.773320 master-0 kubenswrapper[15202]: I0319 09:51:10.772161 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eac8153-6f6c-45de-a444-df3bfae897d1-kube-api-access-4g9jq" (OuterVolumeSpecName: "kube-api-access-4g9jq") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "kube-api-access-4g9jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:10.774742 master-0 kubenswrapper[15202]: I0319 09:51:10.771499 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b44d66bc9-5zxbb"] Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.825287 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.827136 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhllx\" (UniqueName: \"kubernetes.io/projected/c79cbfca-37c7-4d97-87a9-6da6333a6302-kube-api-access-dhllx\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.827215 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-combined-ca-bundle\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.827265 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-scripts\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.827356 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-internal-tls-certs\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.827391 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-fernet-keys\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.827461 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-public-tls-certs\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.833881 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-config-data\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.834158 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-credential-keys\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.834398 15202 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.834416 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9jq\" (UniqueName: \"kubernetes.io/projected/1eac8153-6f6c-45de-a444-df3bfae897d1-kube-api-access-4g9jq\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:10.839812 master-0 kubenswrapper[15202]: I0319 09:51:10.836884 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-scripts\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.858014 master-0 kubenswrapper[15202]: I0319 09:51:10.857968 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-public-tls-certs\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.858283 master-0 kubenswrapper[15202]: I0319 09:51:10.858232 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-combined-ca-bundle\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.858283 master-0 kubenswrapper[15202]: I0319 09:51:10.858270 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-config-data\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 
09:51:10.861533 master-0 kubenswrapper[15202]: I0319 09:51:10.861271 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-fernet-keys\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.863725 master-0 kubenswrapper[15202]: I0319 09:51:10.863674 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-credential-keys\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.908124 master-0 kubenswrapper[15202]: I0319 09:51:10.908001 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c79cbfca-37c7-4d97-87a9-6da6333a6302-internal-tls-certs\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:10.979440 master-0 kubenswrapper[15202]: I0319 09:51:10.979250 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhllx\" (UniqueName: \"kubernetes.io/projected/c79cbfca-37c7-4d97-87a9-6da6333a6302-kube-api-access-dhllx\") pod \"keystone-6b44d66bc9-5zxbb\" (UID: \"c79cbfca-37c7-4d97-87a9-6da6333a6302\") " pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:11.033038 master-0 kubenswrapper[15202]: I0319 09:51:11.032838 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:11.033038 master-0 kubenswrapper[15202]: I0319 09:51:11.032943 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-config" (OuterVolumeSpecName: "config") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:11.039924 master-0 kubenswrapper[15202]: I0319 09:51:11.039882 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:11.045302 master-0 kubenswrapper[15202]: I0319 09:51:11.045247 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:11.045302 master-0 kubenswrapper[15202]: I0319 09:51:11.045283 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:11.045302 master-0 kubenswrapper[15202]: I0319 09:51:11.045294 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:11.061864 master-0 kubenswrapper[15202]: I0319 09:51:11.061684 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-b" 
(OuterVolumeSpecName: "edpm-b") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:11.070794 master-0 kubenswrapper[15202]: I0319 09:51:11.070537 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:11.090197 master-0 kubenswrapper[15202]: I0319 09:51:11.089863 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:11.090833 master-0 kubenswrapper[15202]: I0319 09:51:11.090782 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "1eac8153-6f6c-45de-a444-df3bfae897d1" (UID: "1eac8153-6f6c-45de-a444-df3bfae897d1"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:11.147168 master-0 kubenswrapper[15202]: I0319 09:51:11.147054 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:11.147168 master-0 kubenswrapper[15202]: I0319 09:51:11.147135 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:11.147168 master-0 kubenswrapper[15202]: I0319 09:51:11.147150 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1eac8153-6f6c-45de-a444-df3bfae897d1-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:11.310234 master-0 kubenswrapper[15202]: W0319 09:51:11.310151 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa538e33_3e22_45d9_8109_6aaba8e9ee52.slice/crio-0ab66b5325fc58c2ba3e8c2c8a9d41c32343eb21e820ea68259ea3076b5f298b WatchSource:0}: Error finding container 0ab66b5325fc58c2ba3e8c2c8a9d41c32343eb21e820ea68259ea3076b5f298b: Status 404 returned error can't find the container with id 0ab66b5325fc58c2ba3e8c2c8a9d41c32343eb21e820ea68259ea3076b5f298b Mar 19 09:51:11.311909 master-0 kubenswrapper[15202]: I0319 09:51:11.311787 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" event={"ID":"1eac8153-6f6c-45de-a444-df3bfae897d1","Type":"ContainerDied","Data":"be37d8cfe535a81320a9735a22a2969ad787c4f4ebea62c4e6be3c61acc0bf9e"} Mar 19 09:51:11.311992 master-0 kubenswrapper[15202]: I0319 09:51:11.311970 15202 scope.go:117] "RemoveContainer" containerID="ccec4f13f40436d35baae2a4233753999f750ce51c44ebbc28ce4795ce325986" Mar 19 09:51:11.312529 master-0 
kubenswrapper[15202]: I0319 09:51:11.312497 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" Mar 19 09:51:11.336434 master-0 kubenswrapper[15202]: I0319 09:51:11.336379 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fd5d677-sdj8j"] Mar 19 09:51:11.505857 master-0 kubenswrapper[15202]: I0319 09:51:11.493385 15202 scope.go:117] "RemoveContainer" containerID="61b2b3a855256cabafbc5c00692e31b7226804281bd2d9d3e0c960784e0d8c32" Mar 19 09:51:11.552235 master-0 kubenswrapper[15202]: I0319 09:51:11.552139 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7cd95f9d78-s2fkv"] Mar 19 09:51:11.554398 master-0 kubenswrapper[15202]: I0319 09:51:11.554368 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.641884 master-0 kubenswrapper[15202]: I0319 09:51:11.641824 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cd95f9d78-s2fkv"] Mar 19 09:51:11.669614 master-0 kubenswrapper[15202]: I0319 09:51:11.668677 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2hnn\" (UniqueName: \"kubernetes.io/projected/9a13a111-1257-4963-8c30-51d28728449e-kube-api-access-w2hnn\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.669614 master-0 kubenswrapper[15202]: I0319 09:51:11.668783 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-ovndb-tls-certs\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.669614 master-0 kubenswrapper[15202]: I0319 09:51:11.668851 
15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-combined-ca-bundle\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.669614 master-0 kubenswrapper[15202]: I0319 09:51:11.668886 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-httpd-config\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.669614 master-0 kubenswrapper[15202]: I0319 09:51:11.668936 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-config\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.735596 master-0 kubenswrapper[15202]: I0319 09:51:11.734799 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ddd7f485-2r6bg"] Mar 19 09:51:11.745895 master-0 kubenswrapper[15202]: I0319 09:51:11.745728 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ddd7f485-2r6bg"] Mar 19 09:51:11.776597 master-0 kubenswrapper[15202]: I0319 09:51:11.775553 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-ovndb-tls-certs\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.776597 master-0 kubenswrapper[15202]: I0319 09:51:11.775698 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-combined-ca-bundle\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.776597 master-0 kubenswrapper[15202]: I0319 09:51:11.775794 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-httpd-config\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.776597 master-0 kubenswrapper[15202]: I0319 09:51:11.775896 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-config\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.776597 master-0 kubenswrapper[15202]: I0319 09:51:11.776050 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2hnn\" (UniqueName: \"kubernetes.io/projected/9a13a111-1257-4963-8c30-51d28728449e-kube-api-access-w2hnn\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.794130 master-0 kubenswrapper[15202]: I0319 09:51:11.786364 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-ovndb-tls-certs\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.794130 master-0 kubenswrapper[15202]: I0319 09:51:11.790921 15202 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-httpd-config\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.794130 master-0 kubenswrapper[15202]: I0319 09:51:11.793688 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-config\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.798556 master-0 kubenswrapper[15202]: I0319 09:51:11.797723 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2hnn\" (UniqueName: \"kubernetes.io/projected/9a13a111-1257-4963-8c30-51d28728449e-kube-api-access-w2hnn\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.802016 master-0 kubenswrapper[15202]: I0319 09:51:11.801322 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-combined-ca-bundle\") pod \"neutron-7cd95f9d78-s2fkv\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") " pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.853532 master-0 kubenswrapper[15202]: W0319 09:51:11.853423 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc79cbfca_37c7_4d97_87a9_6da6333a6302.slice/crio-06093e727ab0a2089b50aeff4bbe91910cea2345c933981f16a5b6a24e021f2a WatchSource:0}: Error finding container 06093e727ab0a2089b50aeff4bbe91910cea2345c933981f16a5b6a24e021f2a: Status 404 returned error can't find the container with id 06093e727ab0a2089b50aeff4bbe91910cea2345c933981f16a5b6a24e021f2a Mar 19 
09:51:11.865775 master-0 kubenswrapper[15202]: I0319 09:51:11.865728 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85f97d8d64-dfwgh"] Mar 19 09:51:11.877824 master-0 kubenswrapper[15202]: I0319 09:51:11.877760 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:11.877824 master-0 kubenswrapper[15202]: I0319 09:51:11.877787 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b44d66bc9-5zxbb"] Mar 19 09:51:11.924176 master-0 kubenswrapper[15202]: I0319 09:51:11.924134 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-db-sync-jdc2m" Mar 19 09:51:12.092462 master-0 kubenswrapper[15202]: I0319 09:51:12.092392 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-scripts\") pod \"04e14c4d-4d08-4c3c-8803-d39b03125169\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " Mar 19 09:51:12.094694 master-0 kubenswrapper[15202]: I0319 09:51:12.094664 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j79gs\" (UniqueName: \"kubernetes.io/projected/04e14c4d-4d08-4c3c-8803-d39b03125169-kube-api-access-j79gs\") pod \"04e14c4d-4d08-4c3c-8803-d39b03125169\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " Mar 19 09:51:12.094824 master-0 kubenswrapper[15202]: I0319 09:51:12.094806 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-combined-ca-bundle\") pod \"04e14c4d-4d08-4c3c-8803-d39b03125169\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " Mar 19 09:51:12.094994 master-0 kubenswrapper[15202]: I0319 09:51:12.094973 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-config-data\") pod \"04e14c4d-4d08-4c3c-8803-d39b03125169\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " Mar 19 09:51:12.095048 master-0 kubenswrapper[15202]: I0319 09:51:12.095020 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04e14c4d-4d08-4c3c-8803-d39b03125169-etc-machine-id\") pod \"04e14c4d-4d08-4c3c-8803-d39b03125169\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " Mar 19 09:51:12.095122 master-0 kubenswrapper[15202]: I0319 09:51:12.095106 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-db-sync-config-data\") pod \"04e14c4d-4d08-4c3c-8803-d39b03125169\" (UID: \"04e14c4d-4d08-4c3c-8803-d39b03125169\") " Mar 19 09:51:12.096863 master-0 kubenswrapper[15202]: I0319 09:51:12.096747 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-scripts" (OuterVolumeSpecName: "scripts") pod "04e14c4d-4d08-4c3c-8803-d39b03125169" (UID: "04e14c4d-4d08-4c3c-8803-d39b03125169"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12.096863 master-0 kubenswrapper[15202]: I0319 09:51:12.096814 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04e14c4d-4d08-4c3c-8803-d39b03125169-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "04e14c4d-4d08-4c3c-8803-d39b03125169" (UID: "04e14c4d-4d08-4c3c-8803-d39b03125169"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:12.098744 master-0 kubenswrapper[15202]: I0319 09:51:12.098712 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e14c4d-4d08-4c3c-8803-d39b03125169-kube-api-access-j79gs" (OuterVolumeSpecName: "kube-api-access-j79gs") pod "04e14c4d-4d08-4c3c-8803-d39b03125169" (UID: "04e14c4d-4d08-4c3c-8803-d39b03125169"). InnerVolumeSpecName "kube-api-access-j79gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:12.102834 master-0 kubenswrapper[15202]: I0319 09:51:12.102778 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "04e14c4d-4d08-4c3c-8803-d39b03125169" (UID: "04e14c4d-4d08-4c3c-8803-d39b03125169"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12.198065 master-0 kubenswrapper[15202]: I0319 09:51:12.197982 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j79gs\" (UniqueName: \"kubernetes.io/projected/04e14c4d-4d08-4c3c-8803-d39b03125169-kube-api-access-j79gs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:12.198065 master-0 kubenswrapper[15202]: I0319 09:51:12.198035 15202 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04e14c4d-4d08-4c3c-8803-d39b03125169-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:12.198065 master-0 kubenswrapper[15202]: I0319 09:51:12.198049 15202 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:12.198065 master-0 kubenswrapper[15202]: I0319 09:51:12.198061 15202 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:12.209238 master-0 kubenswrapper[15202]: I0319 09:51:12.209160 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04e14c4d-4d08-4c3c-8803-d39b03125169" (UID: "04e14c4d-4d08-4c3c-8803-d39b03125169"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12.283168 master-0 kubenswrapper[15202]: I0319 09:51:12.281494 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-config-data" (OuterVolumeSpecName: "config-data") pod "04e14c4d-4d08-4c3c-8803-d39b03125169" (UID: "04e14c4d-4d08-4c3c-8803-d39b03125169"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12.305530 master-0 kubenswrapper[15202]: I0319 09:51:12.304651 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:12.305530 master-0 kubenswrapper[15202]: I0319 09:51:12.304834 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e14c4d-4d08-4c3c-8803-d39b03125169-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:12.351319 master-0 kubenswrapper[15202]: I0319 09:51:12.349802 15202 generic.go:334] "Generic (PLEG): container finished" podID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerID="a8a3c7fabe39fe0cd4622e48f19c5485f4afbd93dd8b513868e57f0354d58ce3" exitCode=0 Mar 19 09:51:12.351567 master-0 kubenswrapper[15202]: I0319 09:51:12.351383 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" event={"ID":"aa538e33-3e22-45d9-8109-6aaba8e9ee52","Type":"ContainerDied","Data":"a8a3c7fabe39fe0cd4622e48f19c5485f4afbd93dd8b513868e57f0354d58ce3"} Mar 19 09:51:12.351567 master-0 kubenswrapper[15202]: I0319 09:51:12.351417 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" event={"ID":"aa538e33-3e22-45d9-8109-6aaba8e9ee52","Type":"ContainerStarted","Data":"0ab66b5325fc58c2ba3e8c2c8a9d41c32343eb21e820ea68259ea3076b5f298b"} Mar 19 09:51:12.357924 master-0 kubenswrapper[15202]: I0319 09:51:12.357857 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f97d8d64-dfwgh" event={"ID":"bde4d125-5422-48a5-809b-b7326315062c","Type":"ContainerStarted","Data":"5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859"} Mar 19 09:51:12.357924 master-0 kubenswrapper[15202]: I0319 09:51:12.357919 15202 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-85f97d8d64-dfwgh" event={"ID":"bde4d125-5422-48a5-809b-b7326315062c","Type":"ContainerStarted","Data":"398e0e9ae83ff91111cde00109ab657c02c0a9a9f9134130abd316b594b5e31b"} Mar 19 09:51:12.361916 master-0 kubenswrapper[15202]: I0319 09:51:12.361814 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b44d66bc9-5zxbb" event={"ID":"c79cbfca-37c7-4d97-87a9-6da6333a6302","Type":"ContainerStarted","Data":"ffdcfbf1b86c18134445f3bbec21b860a4bc13630197fe2f4f962c0f99611712"} Mar 19 09:51:12.362171 master-0 kubenswrapper[15202]: I0319 09:51:12.361921 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b44d66bc9-5zxbb" event={"ID":"c79cbfca-37c7-4d97-87a9-6da6333a6302","Type":"ContainerStarted","Data":"06093e727ab0a2089b50aeff4bbe91910cea2345c933981f16a5b6a24e021f2a"} Mar 19 09:51:12.362761 master-0 kubenswrapper[15202]: I0319 09:51:12.362589 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b44d66bc9-5zxbb" Mar 19 09:51:12.365635 master-0 kubenswrapper[15202]: I0319 09:51:12.365581 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-db-sync-jdc2m" event={"ID":"04e14c4d-4d08-4c3c-8803-d39b03125169","Type":"ContainerDied","Data":"7b2bddc13623d27f26a4e91cb34264c17999c7129802cb78be8b0045c4365d98"} Mar 19 09:51:12.365635 master-0 kubenswrapper[15202]: I0319 09:51:12.365634 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b2bddc13623d27f26a4e91cb34264c17999c7129802cb78be8b0045c4365d98" Mar 19 09:51:12.365871 master-0 kubenswrapper[15202]: I0319 09:51:12.365709 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ba05-db-sync-jdc2m" Mar 19 09:51:12.450787 master-0 kubenswrapper[15202]: I0319 09:51:12.449699 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6b44d66bc9-5zxbb" podStartSLOduration=2.4496754960000002 podStartE2EDuration="2.449675496s" podCreationTimestamp="2026-03-19 09:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:12.409403203 +0000 UTC m=+1589.794818039" watchObservedRunningTime="2026-03-19 09:51:12.449675496 +0000 UTC m=+1589.835090312" Mar 19 09:51:12.611321 master-0 kubenswrapper[15202]: I0319 09:51:12.611266 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cd95f9d78-s2fkv"] Mar 19 09:51:12.887427 master-0 kubenswrapper[15202]: I0319 09:51:12.855566 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" path="/var/lib/kubelet/pods/1eac8153-6f6c-45de-a444-df3bfae897d1/volumes" Mar 19 09:51:13.130523 master-0 kubenswrapper[15202]: I0319 09:51:13.130104 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-scheduler-0"] Mar 19 09:51:13.133062 master-0 kubenswrapper[15202]: E0319 09:51:13.131622 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e14c4d-4d08-4c3c-8803-d39b03125169" containerName="cinder-7ba05-db-sync" Mar 19 09:51:13.133062 master-0 kubenswrapper[15202]: I0319 09:51:13.131669 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e14c4d-4d08-4c3c-8803-d39b03125169" containerName="cinder-7ba05-db-sync" Mar 19 09:51:13.133062 master-0 kubenswrapper[15202]: I0319 09:51:13.132159 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e14c4d-4d08-4c3c-8803-d39b03125169" containerName="cinder-7ba05-db-sync" Mar 19 09:51:13.134025 master-0 kubenswrapper[15202]: I0319 09:51:13.133929 
15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.147947 master-0 kubenswrapper[15202]: I0319 09:51:13.147909 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-scripts" Mar 19 09:51:13.148235 master-0 kubenswrapper[15202]: I0319 09:51:13.148217 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-config-data" Mar 19 09:51:13.149948 master-0 kubenswrapper[15202]: I0319 09:51:13.149879 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-scheduler-config-data" Mar 19 09:51:13.252921 master-0 kubenswrapper[15202]: I0319 09:51:13.252858 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-scripts\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.253136 master-0 kubenswrapper[15202]: I0319 09:51:13.252933 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-combined-ca-bundle\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.253136 master-0 kubenswrapper[15202]: I0319 09:51:13.252995 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.253136 master-0 kubenswrapper[15202]: I0319 09:51:13.253050 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data-custom\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.253136 master-0 kubenswrapper[15202]: I0319 09:51:13.253107 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr4tx\" (UniqueName: \"kubernetes.io/projected/9939df7e-1cba-4d74-95d7-524376a36627-kube-api-access-tr4tx\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.253631 master-0 kubenswrapper[15202]: I0319 09:51:13.253284 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9939df7e-1cba-4d74-95d7-524376a36627-etc-machine-id\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.312614 master-0 kubenswrapper[15202]: I0319 09:51:13.302648 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"] Mar 19 09:51:13.367944 master-0 kubenswrapper[15202]: I0319 09:51:13.366888 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-scripts\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.367944 master-0 kubenswrapper[15202]: I0319 09:51:13.366977 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-combined-ca-bundle\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.367944 master-0 kubenswrapper[15202]: I0319 09:51:13.367036 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.367944 master-0 kubenswrapper[15202]: I0319 09:51:13.367063 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data-custom\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.367944 master-0 kubenswrapper[15202]: I0319 09:51:13.367117 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr4tx\" (UniqueName: \"kubernetes.io/projected/9939df7e-1cba-4d74-95d7-524376a36627-kube-api-access-tr4tx\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.367944 master-0 kubenswrapper[15202]: I0319 09:51:13.367155 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9939df7e-1cba-4d74-95d7-524376a36627-etc-machine-id\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.370380 master-0 kubenswrapper[15202]: I0319 09:51:13.370288 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9939df7e-1cba-4d74-95d7-524376a36627-etc-machine-id\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.387686 master-0 kubenswrapper[15202]: I0319 09:51:13.383174 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.398263 master-0 kubenswrapper[15202]: I0319 09:51:13.398200 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-scripts\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.398547 master-0 kubenswrapper[15202]: I0319 09:51:13.398372 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data-custom\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.400238 master-0 kubenswrapper[15202]: I0319 09:51:13.400194 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-combined-ca-bundle\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:13.401711 master-0 kubenswrapper[15202]: I0319 09:51:13.401676 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cd95f9d78-s2fkv" 
event={"ID":"9a13a111-1257-4963-8c30-51d28728449e","Type":"ContainerStarted","Data":"7ff529613299b924c3d4cb1d4031e6538c493fe9beb8cb333c50be6f14dacc6a"} Mar 19 09:51:13.401778 master-0 kubenswrapper[15202]: I0319 09:51:13.401724 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cd95f9d78-s2fkv" event={"ID":"9a13a111-1257-4963-8c30-51d28728449e","Type":"ContainerStarted","Data":"79e3f9cfbfd95ceea9237898839f5bb3b9fef4b984a43e09f631cc348938fbf1"} Mar 19 09:51:13.410359 master-0 kubenswrapper[15202]: I0319 09:51:13.410309 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" event={"ID":"aa538e33-3e22-45d9-8109-6aaba8e9ee52","Type":"ContainerStarted","Data":"9fe984be1016c4aecd435116d8999eaf4799d54dfc85177fffc80d84458462ae"} Mar 19 09:51:13.410685 master-0 kubenswrapper[15202]: I0319 09:51:13.410667 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" Mar 19 09:51:13.414657 master-0 kubenswrapper[15202]: I0319 09:51:13.414605 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f97d8d64-dfwgh" event={"ID":"bde4d125-5422-48a5-809b-b7326315062c","Type":"ContainerStarted","Data":"4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e"} Mar 19 09:51:13.415208 master-0 kubenswrapper[15202]: I0319 09:51:13.415184 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85f97d8d64-dfwgh" Mar 19 09:51:13.549118 master-0 kubenswrapper[15202]: I0319 09:51:13.549013 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:13.556362 master-0 kubenswrapper[15202]: I0319 09:51:13.556273 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.558935 master-0 kubenswrapper[15202]: I0319 09:51:13.558882 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-volume-lvm-iscsi-config-data" Mar 19 09:51:13.593749 master-0 kubenswrapper[15202]: I0319 09:51:13.593562 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-backup-0"] Mar 19 09:51:13.598270 master-0 kubenswrapper[15202]: I0319 09:51:13.598198 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-lib-modules\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.599405 master-0 kubenswrapper[15202]: I0319 09:51:13.599361 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-dev\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.599544 master-0 kubenswrapper[15202]: I0319 09:51:13.599500 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-sys\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.600110 master-0 kubenswrapper[15202]: I0319 09:51:13.599808 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" 
(UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.600263 master-0 kubenswrapper[15202]: I0319 09:51:13.600224 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-run\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.600370 master-0 kubenswrapper[15202]: I0319 09:51:13.600343 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-iscsi\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.600612 master-0 kubenswrapper[15202]: I0319 09:51:13.600541 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-scripts\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.600701 master-0 kubenswrapper[15202]: I0319 09:51:13.600641 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-machine-id\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.600761 master-0 kubenswrapper[15202]: I0319 09:51:13.600697 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.601198 master-0 kubenswrapper[15202]: I0319 09:51:13.601136 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-combined-ca-bundle\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.601575 master-0 kubenswrapper[15202]: I0319 09:51:13.601527 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-nvme\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.601964 master-0 kubenswrapper[15202]: I0319 09:51:13.601752 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnhg2\" (UniqueName: \"kubernetes.io/projected/a8885e16-c286-4619-9b4e-4d7ae54d5753-kube-api-access-cnhg2\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.602423 master-0 kubenswrapper[15202]: I0319 09:51:13.602400 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data-custom\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.602580 master-0 kubenswrapper[15202]: 
I0319 09:51:13.602558 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-lib-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.602699 master-0 kubenswrapper[15202]: I0319 09:51:13.602679 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-brick\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.607035 master-0 kubenswrapper[15202]: I0319 09:51:13.607003 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.610145 master-0 kubenswrapper[15202]: I0319 09:51:13.610075 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-backup-config-data" Mar 19 09:51:13.704819 master-0 kubenswrapper[15202]: I0319 09:51:13.704753 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data-custom\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.704819 master-0 kubenswrapper[15202]: I0319 09:51:13.704813 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-lib-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " 
pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.704858 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-brick\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.704902 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-machine-id\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.704987 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-lib-modules\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.705020 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-lib-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.705049 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-dev\") pod \"cinder-7ba05-backup-0\" (UID: 
\"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.705080 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-dev\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.705098 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-brick\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705140 master-0 kubenswrapper[15202]: I0319 09:51:13.705126 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-sys\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705147 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-sys\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705167 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-combined-ca-bundle\") pod \"cinder-7ba05-backup-0\" (UID: 
\"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705198 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-run\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705228 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705270 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-run\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705303 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data-custom\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705330 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-scripts\") pod \"cinder-7ba05-backup-0\" (UID: 
\"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.705377 master-0 kubenswrapper[15202]: I0319 09:51:13.705364 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-iscsi\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.705642 master-0 kubenswrapper[15202]: I0319 09:51:13.705407 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-nvme\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.705642 master-0 kubenswrapper[15202]: I0319 09:51:13.705432 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-iscsi\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.705642 master-0 kubenswrapper[15202]: I0319 09:51:13.705485 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-scripts\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.705739 master-0 kubenswrapper[15202]: I0319 09:51:13.705521 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-machine-id\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID:
\"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706642 master-0 kubenswrapper[15202]: I0319 09:51:13.705853 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-lib-modules\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706642 master-0 kubenswrapper[15202]: I0319 09:51:13.705969 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-machine-id\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.705998 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-sys\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706004 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-run\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706311 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-brick\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\")
" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706351 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-dev\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706572 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-iscsi\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706617 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706674 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-lib-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706735 master-0 kubenswrapper[15202]: I0319 09:51:13.706731 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") "
pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.706965 master-0 kubenswrapper[15202]: I0319 09:51:13.706806 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-lib-modules\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.706965 master-0 kubenswrapper[15202]: I0319 09:51:13.706834 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-combined-ca-bundle\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.707056 master-0 kubenswrapper[15202]: I0319 09:51:13.707022 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-nvme\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.707096 master-0 kubenswrapper[15202]: I0319 09:51:13.707073 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.707155 master-0 kubenswrapper[15202]: I0319 09:51:13.707128 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxsb8\" (UniqueName: \"kubernetes.io/projected/19c324ae-8b95-430e-b544-90e2a4b5ff33-kube-api-access-rxsb8\") pod \"cinder-7ba05-backup-0\" (UID:
\"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.707194 master-0 kubenswrapper[15202]: I0319 09:51:13.707180 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-nvme\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.707194 master-0 kubenswrapper[15202]: I0319 09:51:13.707183 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhg2\" (UniqueName: \"kubernetes.io/projected/a8885e16-c286-4619-9b4e-4d7ae54d5753-kube-api-access-cnhg2\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.707264 master-0 kubenswrapper[15202]: I0319 09:51:13.707226 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.711404 master-0 kubenswrapper[15202]: I0319 09:51:13.711350 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-scripts\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.711687 master-0 kubenswrapper[15202]: I0319 09:51:13.711082 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-combined-ca-bundle\") pod \"cinder-7ba05-volume-lvm-iscsi-0\"
(UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.711847 master-0 kubenswrapper[15202]: I0319 09:51:13.711817 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data-custom\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.715057 master-0 kubenswrapper[15202]: I0319 09:51:13.715024 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.808901 master-0 kubenswrapper[15202]: I0319 09:51:13.808820 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809177 master-0 kubenswrapper[15202]: I0319 09:51:13.809036 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809237 master-0 kubenswrapper[15202]: I0319 09:51:13.809125 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-machine-id\") pod \"cinder-7ba05-backup-0\" (UID:
\"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809284 master-0 kubenswrapper[15202]: I0319 09:51:13.809168 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-machine-id\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809364 master-0 kubenswrapper[15202]: I0319 09:51:13.809331 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-lib-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809412 master-0 kubenswrapper[15202]: I0319 09:51:13.809369 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-dev\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809490 master-0 kubenswrapper[15202]: I0319 09:51:13.809449 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-brick\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809557 master-0 kubenswrapper[15202]: I0319 09:51:13.809508 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-lib-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19
09:51:13.809608 master-0 kubenswrapper[15202]: I0319 09:51:13.809555 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-sys\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809608 master-0 kubenswrapper[15202]: I0319 09:51:13.809588 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-combined-ca-bundle\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809608 master-0 kubenswrapper[15202]: I0319 09:51:13.809584 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-dev\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809916 master-0 kubenswrapper[15202]: I0319 09:51:13.809632 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-brick\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809916 master-0 kubenswrapper[15202]: I0319 09:51:13.809641 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-run\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.809916 master-0 kubenswrapper[15202]: I0319 09:51:13.809658 15202 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-sys\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.810313 master-0 kubenswrapper[15202]: I0319 09:51:13.810250 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-run\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.810392 master-0 kubenswrapper[15202]: I0319 09:51:13.810338 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data-custom\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.810444 master-0 kubenswrapper[15202]: I0319 09:51:13.810416 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-scripts\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.810617 master-0 kubenswrapper[15202]: I0319 09:51:13.810589 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-nvme\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.810693 master-0 kubenswrapper[15202]: I0319 09:51:13.810626 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-iscsi\") pod
\"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.810964 master-0 kubenswrapper[15202]: I0319 09:51:13.810941 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-lib-modules\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.811056 master-0 kubenswrapper[15202]: I0319 09:51:13.811034 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.811117 master-0 kubenswrapper[15202]: I0319 09:51:13.811078 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxsb8\" (UniqueName: \"kubernetes.io/projected/19c324ae-8b95-430e-b544-90e2a4b5ff33-kube-api-access-rxsb8\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.811411 master-0 kubenswrapper[15202]: I0319 09:51:13.811369 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-iscsi\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.811549 master-0 kubenswrapper[15202]: I0319 09:51:13.811501 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-lib-modules\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") "
pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.812006 master-0 kubenswrapper[15202]: I0319 09:51:13.811808 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-nvme\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.820350 master-0 kubenswrapper[15202]: I0319 09:51:13.816679 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.820350 master-0 kubenswrapper[15202]: I0319 09:51:13.819920 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-scripts\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.820350 master-0 kubenswrapper[15202]: I0319 09:51:13.820264 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data-custom\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.838652 master-0 kubenswrapper[15202]: I0319 09:51:13.838107 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-combined-ca-bundle\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.849069 master-0 kubenswrapper[15202]: I0319 09:51:13.848231 15202
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr4tx\" (UniqueName: \"kubernetes.io/projected/9939df7e-1cba-4d74-95d7-524376a36627-kube-api-access-tr4tx\") pod \"cinder-7ba05-scheduler-0\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:13.859275 master-0 kubenswrapper[15202]: I0319 09:51:13.859209 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-backup-0"]
Mar 19 09:51:13.863756 master-0 kubenswrapper[15202]: I0319 09:51:13.863695 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnhg2\" (UniqueName: \"kubernetes.io/projected/a8885e16-c286-4619-9b4e-4d7ae54d5753-kube-api-access-cnhg2\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.868260 master-0 kubenswrapper[15202]: I0319 09:51:13.868212 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxsb8\" (UniqueName: \"kubernetes.io/projected/19c324ae-8b95-430e-b544-90e2a4b5ff33-kube-api-access-rxsb8\") pod \"cinder-7ba05-backup-0\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:13.874127 master-0 kubenswrapper[15202]: I0319 09:51:13.874092 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"]
Mar 19 09:51:13.879896 master-0 kubenswrapper[15202]: I0319 09:51:13.879801 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" podStartSLOduration=4.879774396 podStartE2EDuration="4.879774396s" podCreationTimestamp="2026-03-19 09:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:13.838414297 +0000 UTC m=+1591.223829123"
watchObservedRunningTime="2026-03-19 09:51:13.879774396 +0000 UTC m=+1591.265189222"
Mar 19 09:51:13.907720 master-0 kubenswrapper[15202]: I0319 09:51:13.907632 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:13.951606 master-0 kubenswrapper[15202]: I0319 09:51:13.951455 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:14.019491 master-0 kubenswrapper[15202]: I0319 09:51:14.017256 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fd5d677-sdj8j"]
Mar 19 09:51:14.039528 master-0 kubenswrapper[15202]: I0319 09:51:14.039422 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85f97d8d64-dfwgh" podStartSLOduration=5.039399851 podStartE2EDuration="5.039399851s" podCreationTimestamp="2026-03-19 09:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:14.012855487 +0000 UTC m=+1591.398270333" watchObservedRunningTime="2026-03-19 09:51:14.039399851 +0000 UTC m=+1591.424814667"
Mar 19 09:51:14.071092 master-0 kubenswrapper[15202]: I0319 09:51:14.071039 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:14.199507 master-0 kubenswrapper[15202]: I0319 09:51:14.193491 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6897ccd865-b6qgp"]
Mar 19 09:51:14.199507 master-0 kubenswrapper[15202]: I0319 09:51:14.195784 15202 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.220501 master-0 kubenswrapper[15202]: I0319 09:51:14.214494 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6897ccd865-b6qgp"]
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.369586 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxtmd\" (UniqueName: \"kubernetes.io/projected/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-kube-api-access-kxtmd\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.369687 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.369774 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-config\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.369913 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319
09:51:14.370001 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-svc\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.370084 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-swift-storage-0\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.370144 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-b\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.372622 master-0 kubenswrapper[15202]: I0319 09:51:14.370187 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-a\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.436318 master-0 kubenswrapper[15202]: I0319 09:51:14.435617 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-api-0"]
Mar 19 09:51:14.438008 master-0 kubenswrapper[15202]: I0319 09:51:14.437925 15202 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:14.442214 master-0 kubenswrapper[15202]: I0319 09:51:14.441963 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-api-config-data"
Mar 19 09:51:14.449102 master-0 kubenswrapper[15202]: I0319 09:51:14.449006 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cd95f9d78-s2fkv" event={"ID":"9a13a111-1257-4963-8c30-51d28728449e","Type":"ContainerStarted","Data":"006abe7cbc3153c692320c85322b89c89f63c8ad1b8605e8c10f9ff7418e02cf"}
Mar 19 09:51:14.449661 master-0 kubenswrapper[15202]: I0319 09:51:14.449498 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7cd95f9d78-s2fkv"
Mar 19 09:51:14.472593 master-0 kubenswrapper[15202]: I0319 09:51:14.472379 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data-custom\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:14.472593 master-0 kubenswrapper[15202]: I0319 09:51:14.472445 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-config\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.472593 master-0 kubenswrapper[15202]: I0319 09:51:14.472544 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.472874 master-0
kubenswrapper[15202]: I0319 09:51:14.472603 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-combined-ca-bundle\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:14.472874 master-0 kubenswrapper[15202]: I0319 09:51:14.472666 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-svc\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:14.472874 master-0 kubenswrapper[15202]: I0319 09:51:14.472705 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:14.472874 master-0 kubenswrapper[15202]: I0319 09:51:14.472754 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7z9d\" (UniqueName: \"kubernetes.io/projected/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-kube-api-access-d7z9d\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:14.472874 master-0 kubenswrapper[15202]: I0319 09:51:14.472778 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-swift-storage-0\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19
09:51:14.472874 master-0 kubenswrapper[15202]: I0319 09:51:14.472836 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-b\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.472874 master-0 kubenswrapper[15202]: I0319 09:51:14.472868 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-a\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.473089 master-0 kubenswrapper[15202]: I0319 09:51:14.472898 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxtmd\" (UniqueName: \"kubernetes.io/projected/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-kube-api-access-kxtmd\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.473089 master-0 kubenswrapper[15202]: I0319 09:51:14.472918 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-etc-machine-id\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.473089 master-0 kubenswrapper[15202]: I0319 09:51:14.472942 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-scripts\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.473089 
master-0 kubenswrapper[15202]: I0319 09:51:14.472968 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-logs\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.473089 master-0 kubenswrapper[15202]: I0319 09:51:14.472991 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.473984 master-0 kubenswrapper[15202]: I0319 09:51:14.473886 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.474499 master-0 kubenswrapper[15202]: I0319 09:51:14.474445 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-swift-storage-0\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.474895 master-0 kubenswrapper[15202]: I0319 09:51:14.474869 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-config\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.475251 master-0 kubenswrapper[15202]: 
I0319 09:51:14.475222 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-b\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.475848 master-0 kubenswrapper[15202]: I0319 09:51:14.475826 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.477510 master-0 kubenswrapper[15202]: I0319 09:51:14.475960 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-a\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.478454 master-0 kubenswrapper[15202]: I0319 09:51:14.478428 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-svc\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.499757 master-0 kubenswrapper[15202]: I0319 09:51:14.499554 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxtmd\" (UniqueName: \"kubernetes.io/projected/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-kube-api-access-kxtmd\") pod \"dnsmasq-dns-6897ccd865-b6qgp\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") " pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.540899 master-0 kubenswrapper[15202]: I0319 09:51:14.540831 15202 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-7ba05-api-0"] Mar 19 09:51:14.576626 master-0 kubenswrapper[15202]: I0319 09:51:14.552734 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:14.577320 master-0 kubenswrapper[15202]: I0319 09:51:14.576914 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-combined-ca-bundle\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.577320 master-0 kubenswrapper[15202]: I0319 09:51:14.577057 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.577320 master-0 kubenswrapper[15202]: I0319 09:51:14.577127 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7z9d\" (UniqueName: \"kubernetes.io/projected/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-kube-api-access-d7z9d\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.577320 master-0 kubenswrapper[15202]: I0319 09:51:14.577303 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-etc-machine-id\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.577522 master-0 kubenswrapper[15202]: I0319 09:51:14.577378 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-scripts\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.577522 master-0 kubenswrapper[15202]: I0319 09:51:14.577448 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-logs\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.577894 master-0 kubenswrapper[15202]: I0319 09:51:14.577863 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data-custom\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.582776 master-0 kubenswrapper[15202]: I0319 09:51:14.581577 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-etc-machine-id\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.585106 master-0 kubenswrapper[15202]: I0319 09:51:14.582573 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-logs\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.592180 master-0 kubenswrapper[15202]: I0319 09:51:14.592118 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-scripts\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " 
pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.593633 master-0 kubenswrapper[15202]: I0319 09:51:14.593260 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data-custom\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.604593 master-0 kubenswrapper[15202]: I0319 09:51:14.602911 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.604593 master-0 kubenswrapper[15202]: I0319 09:51:14.604305 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-combined-ca-bundle\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.604593 master-0 kubenswrapper[15202]: I0319 09:51:14.604391 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7z9d\" (UniqueName: \"kubernetes.io/projected/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-kube-api-access-d7z9d\") pod \"cinder-7ba05-api-0\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.631602 master-0 kubenswrapper[15202]: I0319 09:51:14.629861 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7cd95f9d78-s2fkv" podStartSLOduration=3.629836975 podStartE2EDuration="3.629836975s" podCreationTimestamp="2026-03-19 09:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
09:51:14.494743365 +0000 UTC m=+1591.880158191" watchObservedRunningTime="2026-03-19 09:51:14.629836975 +0000 UTC m=+1592.015251791" Mar 19 09:51:14.772115 master-0 kubenswrapper[15202]: I0319 09:51:14.772040 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:14.927672 master-0 kubenswrapper[15202]: I0319 09:51:14.927621 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"] Mar 19 09:51:14.977586 master-0 kubenswrapper[15202]: W0319 09:51:14.976732 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8885e16_c286_4619_9b4e_4d7ae54d5753.slice/crio-0a288268ee6e842f4f0be884ed78277d2d4f2ecb7533e8bc4d3c83a7bde6a5bd WatchSource:0}: Error finding container 0a288268ee6e842f4f0be884ed78277d2d4f2ecb7533e8bc4d3c83a7bde6a5bd: Status 404 returned error can't find the container with id 0a288268ee6e842f4f0be884ed78277d2d4f2ecb7533e8bc4d3c83a7bde6a5bd Mar 19 09:51:14.979457 master-0 kubenswrapper[15202]: I0319 09:51:14.978841 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:14.996673 master-0 kubenswrapper[15202]: I0319 09:51:14.996095 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-backup-0"] Mar 19 09:51:15.170573 master-0 kubenswrapper[15202]: I0319 09:51:15.170491 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6897ccd865-b6qgp"] Mar 19 09:51:15.184529 master-0 kubenswrapper[15202]: W0319 09:51:15.183910 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce4bbb6c_4df8_45cf_b21d_75dab94bfaa8.slice/crio-b6761f91cd49e2b5e0769df26952258e71b0c64735a00f1c17ae75b49f39b2f2 WatchSource:0}: Error finding container 
b6761f91cd49e2b5e0769df26952258e71b0c64735a00f1c17ae75b49f39b2f2: Status 404 returned error can't find the container with id b6761f91cd49e2b5e0769df26952258e71b0c64735a00f1c17ae75b49f39b2f2 Mar 19 09:51:15.272877 master-0 kubenswrapper[15202]: I0319 09:51:15.272778 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6ddd7f485-2r6bg" podUID="1eac8153-6f6c-45de-a444-df3bfae897d1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.223:5353: i/o timeout" Mar 19 09:51:15.472309 master-0 kubenswrapper[15202]: I0319 09:51:15.472270 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"a8885e16-c286-4619-9b4e-4d7ae54d5753","Type":"ContainerStarted","Data":"0a288268ee6e842f4f0be884ed78277d2d4f2ecb7533e8bc4d3c83a7bde6a5bd"} Mar 19 09:51:15.474530 master-0 kubenswrapper[15202]: I0319 09:51:15.474498 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" event={"ID":"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8","Type":"ContainerStarted","Data":"b6761f91cd49e2b5e0769df26952258e71b0c64735a00f1c17ae75b49f39b2f2"} Mar 19 09:51:15.476528 master-0 kubenswrapper[15202]: I0319 09:51:15.476356 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"9939df7e-1cba-4d74-95d7-524376a36627","Type":"ContainerStarted","Data":"bf3ffd7f07ae452dfcf4688538269ac19f8f412f9044ed366c9155a28678d2f9"} Mar 19 09:51:15.478425 master-0 kubenswrapper[15202]: I0319 09:51:15.478383 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"19c324ae-8b95-430e-b544-90e2a4b5ff33","Type":"ContainerStarted","Data":"8f79868713decfab9207d90f18f83e63e36cd1bfcff3ee7330acc2d2eb3b740e"} Mar 19 09:51:15.478629 master-0 kubenswrapper[15202]: I0319 09:51:15.478595 15202 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="dnsmasq-dns" containerID="cri-o://9fe984be1016c4aecd435116d8999eaf4799d54dfc85177fffc80d84458462ae" gracePeriod=10 Mar 19 09:51:15.512346 master-0 kubenswrapper[15202]: I0319 09:51:15.512283 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-api-0"] Mar 19 09:51:16.494415 master-0 kubenswrapper[15202]: I0319 09:51:16.494346 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701","Type":"ContainerStarted","Data":"fa1442e50632b2261b70cb5220b1069da73b666d271f0d741f14a123a3bc9d66"} Mar 19 09:51:16.494415 master-0 kubenswrapper[15202]: I0319 09:51:16.494404 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701","Type":"ContainerStarted","Data":"09b4f82052fdd2094412a115980526cbccda26d268526aa6dc3bc88ea9b61944"} Mar 19 09:51:16.496836 master-0 kubenswrapper[15202]: I0319 09:51:16.496584 15202 generic.go:334] "Generic (PLEG): container finished" podID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerID="46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7" exitCode=0 Mar 19 09:51:16.496836 master-0 kubenswrapper[15202]: I0319 09:51:16.496651 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" event={"ID":"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8","Type":"ContainerDied","Data":"46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7"} Mar 19 09:51:16.503889 master-0 kubenswrapper[15202]: I0319 09:51:16.503401 15202 generic.go:334] "Generic (PLEG): container finished" podID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerID="9fe984be1016c4aecd435116d8999eaf4799d54dfc85177fffc80d84458462ae" exitCode=0 Mar 19 09:51:16.503889 master-0 kubenswrapper[15202]: I0319 09:51:16.503451 15202 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" event={"ID":"aa538e33-3e22-45d9-8109-6aaba8e9ee52","Type":"ContainerDied","Data":"9fe984be1016c4aecd435116d8999eaf4799d54dfc85177fffc80d84458462ae"} Mar 19 09:51:17.668750 master-0 kubenswrapper[15202]: I0319 09:51:17.668674 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-api-0"] Mar 19 09:51:17.749873 master-0 kubenswrapper[15202]: I0319 09:51:17.749809 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85f97d8d64-dfwgh"] Mar 19 09:51:17.753847 master-0 kubenswrapper[15202]: I0319 09:51:17.750523 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85f97d8d64-dfwgh" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-api" containerID="cri-o://5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859" gracePeriod=30 Mar 19 09:51:17.754258 master-0 kubenswrapper[15202]: I0319 09:51:17.751092 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85f97d8d64-dfwgh" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-httpd" containerID="cri-o://4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e" gracePeriod=30 Mar 19 09:51:17.784910 master-0 kubenswrapper[15202]: I0319 09:51:17.784843 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-85f97d8d64-dfwgh" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.128.0.226:9696/\": EOF" Mar 19 09:51:18.433442 master-0 kubenswrapper[15202]: I0319 09:51:18.433338 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77db675565-g4zz2"] Mar 19 09:51:18.436350 master-0 kubenswrapper[15202]: I0319 09:51:18.436281 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.439820 master-0 kubenswrapper[15202]: I0319 09:51:18.438913 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 09:51:18.439820 master-0 kubenswrapper[15202]: I0319 09:51:18.439108 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 09:51:18.448198 master-0 kubenswrapper[15202]: I0319 09:51:18.448132 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77db675565-g4zz2"] Mar 19 09:51:18.618859 master-0 kubenswrapper[15202]: I0319 09:51:18.618782 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-ovndb-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.618859 master-0 kubenswrapper[15202]: I0319 09:51:18.618856 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-combined-ca-bundle\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.619254 master-0 kubenswrapper[15202]: I0319 09:51:18.619204 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-httpd-config\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.619410 master-0 kubenswrapper[15202]: I0319 09:51:18.619368 15202 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-config\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.619600 master-0 kubenswrapper[15202]: I0319 09:51:18.619573 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfsp\" (UniqueName: \"kubernetes.io/projected/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-kube-api-access-qwfsp\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.619764 master-0 kubenswrapper[15202]: I0319 09:51:18.619737 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-internal-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.619861 master-0 kubenswrapper[15202]: I0319 09:51:18.619837 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-public-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.722316 master-0 kubenswrapper[15202]: I0319 09:51:18.722250 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-httpd-config\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.722934 master-0 kubenswrapper[15202]: I0319 
09:51:18.722337 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-config\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.722934 master-0 kubenswrapper[15202]: I0319 09:51:18.722397 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfsp\" (UniqueName: \"kubernetes.io/projected/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-kube-api-access-qwfsp\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.722934 master-0 kubenswrapper[15202]: I0319 09:51:18.722637 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-internal-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.722934 master-0 kubenswrapper[15202]: I0319 09:51:18.722819 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-public-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.723080 master-0 kubenswrapper[15202]: I0319 09:51:18.722984 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-ovndb-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.723080 master-0 kubenswrapper[15202]: I0319 09:51:18.723027 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-combined-ca-bundle\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.727435 master-0 kubenswrapper[15202]: I0319 09:51:18.727190 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-internal-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.727775 master-0 kubenswrapper[15202]: I0319 09:51:18.727716 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-httpd-config\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.730373 master-0 kubenswrapper[15202]: I0319 09:51:18.730232 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-public-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.733537 master-0 kubenswrapper[15202]: I0319 09:51:18.733486 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-ovndb-tls-certs\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.738439 master-0 kubenswrapper[15202]: I0319 09:51:18.736214 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-config\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.738439 master-0 kubenswrapper[15202]: I0319 09:51:18.736596 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-combined-ca-bundle\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.749450 master-0 kubenswrapper[15202]: I0319 09:51:18.740171 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfsp\" (UniqueName: \"kubernetes.io/projected/1c78c47a-7a9a-4835-92ed-a3da198b2cc8-kube-api-access-qwfsp\") pod \"neutron-77db675565-g4zz2\" (UID: \"1c78c47a-7a9a-4835-92ed-a3da198b2cc8\") " pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:18.789681 master-0 kubenswrapper[15202]: I0319 09:51:18.789475 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77db675565-g4zz2" Mar 19 09:51:20.586596 master-0 kubenswrapper[15202]: I0319 09:51:20.586529 15202 generic.go:334] "Generic (PLEG): container finished" podID="bde4d125-5422-48a5-809b-b7326315062c" containerID="4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e" exitCode=0 Mar 19 09:51:20.586596 master-0 kubenswrapper[15202]: I0319 09:51:20.586595 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f97d8d64-dfwgh" event={"ID":"bde4d125-5422-48a5-809b-b7326315062c","Type":"ContainerDied","Data":"4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e"} Mar 19 09:51:20.872805 master-0 kubenswrapper[15202]: I0319 09:51:20.872687 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:21.009497 master-0 kubenswrapper[15202]: I0319 09:51:21.009142 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zgq8\" (UniqueName: \"kubernetes.io/projected/aa538e33-3e22-45d9-8109-6aaba8e9ee52-kube-api-access-9zgq8\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.009497 master-0 kubenswrapper[15202]: I0319 09:51:21.009200 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-nb\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.009497 master-0 kubenswrapper[15202]: I0319 09:51:21.009225 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-b\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.009497 master-0 kubenswrapper[15202]: I0319 09:51:21.009258 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-sb\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.009497 master-0 kubenswrapper[15202]: I0319 09:51:21.009299 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-swift-storage-0\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.012595 master-0 kubenswrapper[15202]: I0319 09:51:21.010926 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-svc\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.012595 master-0 kubenswrapper[15202]: I0319 09:51:21.011060 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-config\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.012595 master-0 kubenswrapper[15202]: I0319 09:51:21.011225 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-a\") pod \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\" (UID: \"aa538e33-3e22-45d9-8109-6aaba8e9ee52\") "
Mar 19 09:51:21.020837 master-0 kubenswrapper[15202]: I0319 09:51:21.016571 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa538e33-3e22-45d9-8109-6aaba8e9ee52-kube-api-access-9zgq8" (OuterVolumeSpecName: "kube-api-access-9zgq8") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "kube-api-access-9zgq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:51:21.065657 master-0 kubenswrapper[15202]: I0319 09:51:21.065591 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.069263 master-0 kubenswrapper[15202]: I0319 09:51:21.069197 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-config" (OuterVolumeSpecName: "config") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.071366 master-0 kubenswrapper[15202]: I0319 09:51:21.071293 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.088137 master-0 kubenswrapper[15202]: I0319 09:51:21.087622 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.091425 master-0 kubenswrapper[15202]: I0319 09:51:21.090774 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.092248 master-0 kubenswrapper[15202]: I0319 09:51:21.092178 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.101973 master-0 kubenswrapper[15202]: I0319 09:51:21.101820 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa538e33-3e22-45d9-8109-6aaba8e9ee52" (UID: "aa538e33-3e22-45d9-8109-6aaba8e9ee52"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116297 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zgq8\" (UniqueName: \"kubernetes.io/projected/aa538e33-3e22-45d9-8109-6aaba8e9ee52-kube-api-access-9zgq8\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116350 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116365 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116379 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116391 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116403 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116418 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.116517 master-0 kubenswrapper[15202]: I0319 09:51:21.116430 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/aa538e33-3e22-45d9-8109-6aaba8e9ee52-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:21.598662 master-0 kubenswrapper[15202]: I0319 09:51:21.598331 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" event={"ID":"aa538e33-3e22-45d9-8109-6aaba8e9ee52","Type":"ContainerDied","Data":"0ab66b5325fc58c2ba3e8c2c8a9d41c32343eb21e820ea68259ea3076b5f298b"}
Mar 19 09:51:21.599183 master-0 kubenswrapper[15202]: I0319 09:51:21.598741 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j"
Mar 19 09:51:21.599183 master-0 kubenswrapper[15202]: I0319 09:51:21.598865 15202 scope.go:117] "RemoveContainer" containerID="9fe984be1016c4aecd435116d8999eaf4799d54dfc85177fffc80d84458462ae"
Mar 19 09:51:21.661497 master-0 kubenswrapper[15202]: I0319 09:51:21.659920 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fd5d677-sdj8j"]
Mar 19 09:51:21.677031 master-0 kubenswrapper[15202]: I0319 09:51:21.674122 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849fd5d677-sdj8j"]
Mar 19 09:51:21.784982 master-0 kubenswrapper[15202]: I0319 09:51:21.782476 15202 scope.go:117] "RemoveContainer" containerID="a8a3c7fabe39fe0cd4622e48f19c5485f4afbd93dd8b513868e57f0354d58ce3"
Mar 19 09:51:22.186572 master-0 kubenswrapper[15202]: I0319 09:51:22.185718 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77db675565-g4zz2"]
Mar 19 09:51:22.636012 master-0 kubenswrapper[15202]: I0319 09:51:22.635966 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" event={"ID":"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8","Type":"ContainerStarted","Data":"c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd"}
Mar 19 09:51:22.637205 master-0 kubenswrapper[15202]: I0319 09:51:22.636676 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:51:22.644680 master-0 kubenswrapper[15202]: I0319 09:51:22.642701 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77db675565-g4zz2" event={"ID":"1c78c47a-7a9a-4835-92ed-a3da198b2cc8","Type":"ContainerStarted","Data":"fb31de1a938e32270a93aa70137068ad41d153acb5ae53e7e2695669c47028ac"}
Mar 19 09:51:22.644680 master-0 kubenswrapper[15202]: I0319 09:51:22.642750 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77db675565-g4zz2" event={"ID":"1c78c47a-7a9a-4835-92ed-a3da198b2cc8","Type":"ContainerStarted","Data":"3d74af9191cc12a82af5888a46e3c3cec45cf6b9f65affe4b56af1a699ff524f"}
Mar 19 09:51:22.644680 master-0 kubenswrapper[15202]: I0319 09:51:22.644412 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"19c324ae-8b95-430e-b544-90e2a4b5ff33","Type":"ContainerStarted","Data":"7bc8077f80a1a873040b36173eedf2d6f180726ba0a067e4a115c34e9692097e"}
Mar 19 09:51:22.653394 master-0 kubenswrapper[15202]: I0319 09:51:22.653338 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"a8885e16-c286-4619-9b4e-4d7ae54d5753","Type":"ContainerStarted","Data":"b893ea09ee5463171fc8ea80053d04640ec43ceccf6e1dde295d39c419f750d3"}
Mar 19 09:51:22.660213 master-0 kubenswrapper[15202]: I0319 09:51:22.660167 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" event={"ID":"e2d72e05-7738-45d4-8b7a-2bfdb439e7f5","Type":"ContainerStarted","Data":"e497c0d4f3aed51dbd5adb9bedbd7fb292ddb40d21559956fac79f72c1d967a1"}
Mar 19 09:51:22.663397 master-0 kubenswrapper[15202]: I0319 09:51:22.663136 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" podStartSLOduration=8.663112179 podStartE2EDuration="8.663112179s" podCreationTimestamp="2026-03-19 09:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:22.659611284 +0000 UTC m=+1600.045026110" watchObservedRunningTime="2026-03-19 09:51:22.663112179 +0000 UTC m=+1600.048527015"
Mar 19 09:51:22.683327 master-0 kubenswrapper[15202]: I0319 09:51:22.682604 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" event={"ID":"9bf992bb-2aac-49c3-8135-6ab9f3a53193","Type":"ContainerStarted","Data":"e21b48d1e3578eaebde2202d33f40964d5e8764bd286dbbc50344769c036e6be"}
Mar 19 09:51:22.750873 master-0 kubenswrapper[15202]: I0319 09:51:22.750804 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" podStartSLOduration=17.986129613 podStartE2EDuration="31.750780441s" podCreationTimestamp="2026-03-19 09:50:51 +0000 UTC" firstStartedPulling="2026-03-19 09:51:07.987994139 +0000 UTC m=+1585.373408955" lastFinishedPulling="2026-03-19 09:51:21.752644977 +0000 UTC m=+1599.138059783" observedRunningTime="2026-03-19 09:51:22.685651385 +0000 UTC m=+1600.071066201" watchObservedRunningTime="2026-03-19 09:51:22.750780441 +0000 UTC m=+1600.136195257"
Mar 19 09:51:22.765921 master-0 kubenswrapper[15202]: I0319 09:51:22.765818 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" podStartSLOduration=1.86558752 podStartE2EDuration="56.765799161s" podCreationTimestamp="2026-03-19 09:50:26 +0000 UTC" firstStartedPulling="2026-03-19 09:50:27.023509423 +0000 UTC m=+1544.408924239" lastFinishedPulling="2026-03-19 09:51:21.923721064 +0000 UTC m=+1599.309135880" observedRunningTime="2026-03-19 09:51:22.718441144 +0000 UTC m=+1600.103855960" watchObservedRunningTime="2026-03-19 09:51:22.765799161 +0000 UTC m=+1600.151213977"
Mar 19 09:51:22.832412 master-0 kubenswrapper[15202]: I0319 09:51:22.832000 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" path="/var/lib/kubelet/pods/aa538e33-3e22-45d9-8109-6aaba8e9ee52/volumes"
Mar 19 09:51:23.713993 master-0 kubenswrapper[15202]: I0319 09:51:23.713918 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"19c324ae-8b95-430e-b544-90e2a4b5ff33","Type":"ContainerStarted","Data":"c734fd4d54f1b4c7366360129b3be32cf4dc3eb52370ac898b6412882342c45c"}
Mar 19 09:51:23.731608 master-0 kubenswrapper[15202]: I0319 09:51:23.731548 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"a8885e16-c286-4619-9b4e-4d7ae54d5753","Type":"ContainerStarted","Data":"e008b9f9b07ae34f63998bc35569b46259da23c00f3c750499c0a6f4862ea4a0"}
Mar 19 09:51:23.739135 master-0 kubenswrapper[15202]: I0319 09:51:23.739050 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701","Type":"ContainerStarted","Data":"3f51daa3ce99a014f950d0d92a17691f6a503613737be3d2f74d83785113c1e0"}
Mar 19 09:51:23.739383 master-0 kubenswrapper[15202]: I0319 09:51:23.739244 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-api-0" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-7ba05-api-log" containerID="cri-o://fa1442e50632b2261b70cb5220b1069da73b666d271f0d741f14a123a3bc9d66" gracePeriod=30
Mar 19 09:51:23.739863 master-0 kubenswrapper[15202]: I0319 09:51:23.739458 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:23.739863 master-0 kubenswrapper[15202]: I0319 09:51:23.739512 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-api-0" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-api" containerID="cri-o://3f51daa3ce99a014f950d0d92a17691f6a503613737be3d2f74d83785113c1e0" gracePeriod=30
Mar 19 09:51:23.745445 master-0 kubenswrapper[15202]: I0319 09:51:23.745388 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77db675565-g4zz2" event={"ID":"1c78c47a-7a9a-4835-92ed-a3da198b2cc8","Type":"ContainerStarted","Data":"fa511a9f93856c5492911e69927b2166acf294add7e84ff9098c52e65bee033c"}
Mar 19 09:51:23.746399 master-0 kubenswrapper[15202]: I0319 09:51:23.746363 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77db675565-g4zz2"
Mar 19 09:51:23.751586 master-0 kubenswrapper[15202]: I0319 09:51:23.750147 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"9939df7e-1cba-4d74-95d7-524376a36627","Type":"ContainerStarted","Data":"2e4b8a0163a3376a870446b23c716df8eff0060d7686a913ba754944ccbb4e0e"}
Mar 19 09:51:23.827830 master-0 kubenswrapper[15202]: I0319 09:51:23.827259 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-backup-0" podStartSLOduration=4.243405977 podStartE2EDuration="10.827232945s" podCreationTimestamp="2026-03-19 09:51:13 +0000 UTC" firstStartedPulling="2026-03-19 09:51:14.971985638 +0000 UTC m=+1592.357400454" lastFinishedPulling="2026-03-19 09:51:21.555812606 +0000 UTC m=+1598.941227422" observedRunningTime="2026-03-19 09:51:23.810229636 +0000 UTC m=+1601.195644462" watchObservedRunningTime="2026-03-19 09:51:23.827232945 +0000 UTC m=+1601.212647761"
Mar 19 09:51:23.908918 master-0 kubenswrapper[15202]: I0319 09:51:23.908873 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0"
Mar 19 09:51:23.956266 master-0 kubenswrapper[15202]: I0319 09:51:23.956193 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:24.032494 master-0 kubenswrapper[15202]: I0319 09:51:24.032389 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" podStartSLOduration=4.089527335 podStartE2EDuration="11.032370121s" podCreationTimestamp="2026-03-19 09:51:13 +0000 UTC" firstStartedPulling="2026-03-19 09:51:14.980877308 +0000 UTC m=+1592.366292124" lastFinishedPulling="2026-03-19 09:51:21.923720094 +0000 UTC m=+1599.309134910" observedRunningTime="2026-03-19 09:51:24.00839592 +0000 UTC m=+1601.393810736" watchObservedRunningTime="2026-03-19 09:51:24.032370121 +0000 UTC m=+1601.417784937"
Mar 19 09:51:24.122871 master-0 kubenswrapper[15202]: I0319 09:51:24.122752 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77db675565-g4zz2" podStartSLOduration=6.122728908 podStartE2EDuration="6.122728908s" podCreationTimestamp="2026-03-19 09:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:24.11917437 +0000 UTC m=+1601.504589186" watchObservedRunningTime="2026-03-19 09:51:24.122728908 +0000 UTC m=+1601.508143724"
Mar 19 09:51:24.163342 master-0 kubenswrapper[15202]: I0319 09:51:24.160582 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-api-0" podStartSLOduration=10.160554011 podStartE2EDuration="10.160554011s" podCreationTimestamp="2026-03-19 09:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:24.149715963 +0000 UTC m=+1601.535130779" watchObservedRunningTime="2026-03-19 09:51:24.160554011 +0000 UTC m=+1601.545968837"
Mar 19 09:51:24.775156 master-0 kubenswrapper[15202]: I0319 09:51:24.775101 15202 generic.go:334] "Generic (PLEG): container finished" podID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerID="3f51daa3ce99a014f950d0d92a17691f6a503613737be3d2f74d83785113c1e0" exitCode=0
Mar 19 09:51:24.775156 master-0 kubenswrapper[15202]: I0319 09:51:24.775150 15202 generic.go:334] "Generic (PLEG): container finished" podID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerID="fa1442e50632b2261b70cb5220b1069da73b666d271f0d741f14a123a3bc9d66" exitCode=143
Mar 19 09:51:24.775539 master-0 kubenswrapper[15202]: I0319 09:51:24.775195 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701","Type":"ContainerDied","Data":"3f51daa3ce99a014f950d0d92a17691f6a503613737be3d2f74d83785113c1e0"}
Mar 19 09:51:24.775539 master-0 kubenswrapper[15202]: I0319 09:51:24.775228 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701","Type":"ContainerDied","Data":"fa1442e50632b2261b70cb5220b1069da73b666d271f0d741f14a123a3bc9d66"}
Mar 19 09:51:24.787741 master-0 kubenswrapper[15202]: I0319 09:51:24.782890 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"9939df7e-1cba-4d74-95d7-524376a36627","Type":"ContainerStarted","Data":"9f75f7a49f0c64e8251eca0e1af7f9568a28cfb22959ddd5127b289d83248fb2"}
Mar 19 09:51:24.990562 master-0 kubenswrapper[15202]: I0319 09:51:24.990407 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:25.056693 master-0 kubenswrapper[15202]: I0319 09:51:25.056451 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-etc-machine-id\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.056693 master-0 kubenswrapper[15202]: I0319 09:51:25.056615 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-combined-ca-bundle\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.057166 master-0 kubenswrapper[15202]: I0319 09:51:25.056791 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-logs\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.057166 master-0 kubenswrapper[15202]: I0319 09:51:25.057005 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-scripts\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.057166 master-0 kubenswrapper[15202]: I0319 09:51:25.057064 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.057166 master-0 kubenswrapper[15202]: I0319 09:51:25.057114 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data-custom\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.057420 master-0 kubenswrapper[15202]: I0319 09:51:25.057244 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7z9d\" (UniqueName: \"kubernetes.io/projected/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-kube-api-access-d7z9d\") pod \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\" (UID: \"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701\") "
Mar 19 09:51:25.058895 master-0 kubenswrapper[15202]: I0319 09:51:25.058499 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-logs" (OuterVolumeSpecName: "logs") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:51:25.058895 master-0 kubenswrapper[15202]: I0319 09:51:25.058582 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:51:25.078157 master-0 kubenswrapper[15202]: I0319 09:51:25.076501 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-kube-api-access-d7z9d" (OuterVolumeSpecName: "kube-api-access-d7z9d") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "kube-api-access-d7z9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:51:25.079412 master-0 kubenswrapper[15202]: I0319 09:51:25.078939 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-scheduler-0" podStartSLOduration=6.14499706 podStartE2EDuration="13.078922878s" podCreationTimestamp="2026-03-19 09:51:12 +0000 UTC" firstStartedPulling="2026-03-19 09:51:14.971983218 +0000 UTC m=+1592.357398034" lastFinishedPulling="2026-03-19 09:51:21.905909036 +0000 UTC m=+1599.291323852" observedRunningTime="2026-03-19 09:51:24.836858391 +0000 UTC m=+1602.222273217" watchObservedRunningTime="2026-03-19 09:51:25.078922878 +0000 UTC m=+1602.464337684"
Mar 19 09:51:25.134104 master-0 kubenswrapper[15202]: I0319 09:51:25.131521 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-scripts" (OuterVolumeSpecName: "scripts") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:25.145340 master-0 kubenswrapper[15202]: I0319 09:51:25.145275 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:25.167620 master-0 kubenswrapper[15202]: I0319 09:51:25.160336 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7z9d\" (UniqueName: \"kubernetes.io/projected/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-kube-api-access-d7z9d\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.167620 master-0 kubenswrapper[15202]: I0319 09:51:25.160409 15202 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.167620 master-0 kubenswrapper[15202]: I0319 09:51:25.160426 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.167620 master-0 kubenswrapper[15202]: I0319 09:51:25.160438 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.167620 master-0 kubenswrapper[15202]: I0319 09:51:25.160451 15202 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.168233 master-0 kubenswrapper[15202]: I0319 09:51:25.168148 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:25.213034 master-0 kubenswrapper[15202]: I0319 09:51:25.212144 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data" (OuterVolumeSpecName: "config-data") pod "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" (UID: "08a6db0f-1a0b-4dbe-bc17-bbdbf1751701"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:25.262986 master-0 kubenswrapper[15202]: I0319 09:51:25.262045 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.262986 master-0 kubenswrapper[15202]: I0319 09:51:25.262092 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:25.428703 master-0 kubenswrapper[15202]: I0319 09:51:25.428619 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-849fd5d677-sdj8j" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.225:5353: i/o timeout"
Mar 19 09:51:25.794765 master-0 kubenswrapper[15202]: I0319 09:51:25.794715 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:25.795745 master-0 kubenswrapper[15202]: I0319 09:51:25.795698 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"08a6db0f-1a0b-4dbe-bc17-bbdbf1751701","Type":"ContainerDied","Data":"09b4f82052fdd2094412a115980526cbccda26d268526aa6dc3bc88ea9b61944"}
Mar 19 09:51:25.795951 master-0 kubenswrapper[15202]: I0319 09:51:25.795769 15202 scope.go:117] "RemoveContainer" containerID="3f51daa3ce99a014f950d0d92a17691f6a503613737be3d2f74d83785113c1e0"
Mar 19 09:51:25.851513 master-0 kubenswrapper[15202]: I0319 09:51:25.850855 15202 scope.go:117] "RemoveContainer" containerID="fa1442e50632b2261b70cb5220b1069da73b666d271f0d741f14a123a3bc9d66"
Mar 19 09:51:25.857944 master-0 kubenswrapper[15202]: I0319 09:51:25.857342 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-api-0"]
Mar 19 09:51:25.870541 master-0 kubenswrapper[15202]: I0319 09:51:25.870143 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ba05-api-0"]
Mar 19 09:51:25.899332 master-0 kubenswrapper[15202]: I0319 09:51:25.899245 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-api-0"]
Mar 19 09:51:25.900370 master-0 kubenswrapper[15202]: E0319 09:51:25.900325 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-7ba05-api-log"
Mar 19 09:51:25.900441 master-0 kubenswrapper[15202]: I0319 09:51:25.900388 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-7ba05-api-log"
Mar 19 09:51:25.900441 master-0 kubenswrapper[15202]: E0319 09:51:25.900428 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="dnsmasq-dns"
Mar 19 09:51:25.900598 master-0 kubenswrapper[15202]: I0319 09:51:25.900455 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="dnsmasq-dns"
Mar 19 09:51:25.900598 master-0 kubenswrapper[15202]: E0319 09:51:25.900515 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="init"
Mar 19 09:51:25.900598 master-0 kubenswrapper[15202]: I0319 09:51:25.900527 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="init"
Mar 19 09:51:25.900598 master-0 kubenswrapper[15202]: E0319 09:51:25.900541 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-api"
Mar 19 09:51:25.900598 master-0 kubenswrapper[15202]: I0319 09:51:25.900550 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-api"
Mar 19 09:51:25.900844 master-0 kubenswrapper[15202]: I0319 09:51:25.900818 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa538e33-3e22-45d9-8109-6aaba8e9ee52" containerName="dnsmasq-dns"
Mar 19 09:51:25.900910 master-0 kubenswrapper[15202]: I0319 09:51:25.900861 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-api"
Mar 19 09:51:25.900910 master-0 kubenswrapper[15202]: I0319 09:51:25.900896 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" containerName="cinder-7ba05-api-log"
Mar 19 09:51:25.925832 master-0 kubenswrapper[15202]: I0319 09:51:25.925698 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:25.927790 master-0 kubenswrapper[15202]: I0319 09:51:25.927550 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-api-config-data"
Mar 19 09:51:25.930814 master-0 kubenswrapper[15202]: I0319 09:51:25.929964 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 19 09:51:25.932188 master-0 kubenswrapper[15202]: I0319 09:51:25.932127 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 19 09:51:25.956251 master-0 kubenswrapper[15202]: I0319 09:51:25.956062 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-api-0"]
Mar 19 09:51:26.085545 master-0 kubenswrapper[15202]: I0319 09:51:26.085374 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-public-tls-certs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085545 master-0 kubenswrapper[15202]: I0319 09:51:26.085522 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29cc00b0-1537-42ce-b8ce-918dea958cf9-etc-machine-id\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802 master-0 kubenswrapper[15202]: I0319 09:51:26.085562 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29cc00b0-1537-42ce-b8ce-918dea958cf9-logs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802 master-0 kubenswrapper[15202]: I0319 09:51:26.085602 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-scripts\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802 master-0 kubenswrapper[15202]: I0319 09:51:26.085633 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-config-data-custom\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802 master-0 kubenswrapper[15202]: I0319 09:51:26.085660 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62587\" (UniqueName: \"kubernetes.io/projected/29cc00b0-1537-42ce-b8ce-918dea958cf9-kube-api-access-62587\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802 master-0 kubenswrapper[15202]: I0319 09:51:26.085688 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-combined-ca-bundle\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802 master-0 kubenswrapper[15202]: I0319 09:51:26.085758 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-config-data\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:26.085802
master-0 kubenswrapper[15202]: I0319 09:51:26.085787 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-internal-tls-certs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189303 master-0 kubenswrapper[15202]: I0319 09:51:26.189202 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-config-data\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189566 master-0 kubenswrapper[15202]: I0319 09:51:26.189414 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-internal-tls-certs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189566 master-0 kubenswrapper[15202]: I0319 09:51:26.189457 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-public-tls-certs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189658 master-0 kubenswrapper[15202]: I0319 09:51:26.189583 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29cc00b0-1537-42ce-b8ce-918dea958cf9-etc-machine-id\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189658 master-0 kubenswrapper[15202]: I0319 09:51:26.189617 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29cc00b0-1537-42ce-b8ce-918dea958cf9-logs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189658 master-0 kubenswrapper[15202]: I0319 09:51:26.189642 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-scripts\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189748 master-0 kubenswrapper[15202]: I0319 09:51:26.189670 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-config-data-custom\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189748 master-0 kubenswrapper[15202]: I0319 09:51:26.189699 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62587\" (UniqueName: \"kubernetes.io/projected/29cc00b0-1537-42ce-b8ce-918dea958cf9-kube-api-access-62587\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189748 master-0 kubenswrapper[15202]: I0319 09:51:26.189730 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-combined-ca-bundle\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.189838 master-0 kubenswrapper[15202]: I0319 09:51:26.189763 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/29cc00b0-1537-42ce-b8ce-918dea958cf9-etc-machine-id\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.190373 master-0 kubenswrapper[15202]: I0319 09:51:26.190349 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29cc00b0-1537-42ce-b8ce-918dea958cf9-logs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.195247 master-0 kubenswrapper[15202]: I0319 09:51:26.193857 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-config-data\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.195247 master-0 kubenswrapper[15202]: I0319 09:51:26.194371 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-scripts\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.195615 master-0 kubenswrapper[15202]: I0319 09:51:26.195251 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-combined-ca-bundle\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.200628 master-0 kubenswrapper[15202]: I0319 09:51:26.198724 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-public-tls-certs\") pod \"cinder-7ba05-api-0\" (UID: 
\"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.200628 master-0 kubenswrapper[15202]: I0319 09:51:26.198759 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-internal-tls-certs\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.203441 master-0 kubenswrapper[15202]: I0319 09:51:26.203383 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29cc00b0-1537-42ce-b8ce-918dea958cf9-config-data-custom\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.212991 master-0 kubenswrapper[15202]: I0319 09:51:26.212933 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62587\" (UniqueName: \"kubernetes.io/projected/29cc00b0-1537-42ce-b8ce-918dea958cf9-kube-api-access-62587\") pod \"cinder-7ba05-api-0\" (UID: \"29cc00b0-1537-42ce-b8ce-918dea958cf9\") " pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.271720 master-0 kubenswrapper[15202]: I0319 09:51:26.269896 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:26.609995 master-0 kubenswrapper[15202]: I0319 09:51:26.609854 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:51:26.722339 master-0 kubenswrapper[15202]: I0319 09:51:26.722259 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-api-0"] Mar 19 09:51:26.807753 master-0 kubenswrapper[15202]: I0319 09:51:26.807711 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"29cc00b0-1537-42ce-b8ce-918dea958cf9","Type":"ContainerStarted","Data":"2ed426c231cf02d2baa6b27aaf6bd0ab3bddb1c65325e2ea1233a99dd6e7685a"} Mar 19 09:51:26.832166 master-0 kubenswrapper[15202]: I0319 09:51:26.832104 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a6db0f-1a0b-4dbe-bc17-bbdbf1751701" path="/var/lib/kubelet/pods/08a6db0f-1a0b-4dbe-bc17-bbdbf1751701/volumes" Mar 19 09:51:26.980434 master-0 kubenswrapper[15202]: I0319 09:51:26.980237 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:51:26.982277 master-0 kubenswrapper[15202]: I0319 09:51:26.982236 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:51:26.984173 master-0 kubenswrapper[15202]: I0319 09:51:26.984139 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:51:27.823398 master-0 kubenswrapper[15202]: I0319 09:51:27.823233 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" 
event={"ID":"29cc00b0-1537-42ce-b8ce-918dea958cf9","Type":"ContainerStarted","Data":"af46526fdbe6d1636ff5dd48a12b002fddc7167206ddad47a2cf5cd421f9739d"} Mar 19 09:51:27.831669 master-0 kubenswrapper[15202]: I0319 09:51:27.831524 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5" Mar 19 09:51:28.061663 master-0 kubenswrapper[15202]: I0319 09:51:28.061577 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-lfsjb"] Mar 19 09:51:28.063767 master-0 kubenswrapper[15202]: I0319 09:51:28.063723 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.078249 master-0 kubenswrapper[15202]: I0319 09:51:28.077380 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-lfsjb"] Mar 19 09:51:28.152069 master-0 kubenswrapper[15202]: I0319 09:51:28.151997 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/18d2318b-1a1b-45a3-9b06-ea750daf9e17-image-data\") pod \"edpm-a-provisionserver-checksum-discovery-lfsjb\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.152307 master-0 kubenswrapper[15202]: I0319 09:51:28.152201 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v5bq\" (UniqueName: \"kubernetes.io/projected/18d2318b-1a1b-45a3-9b06-ea750daf9e17-kube-api-access-5v5bq\") pod \"edpm-a-provisionserver-checksum-discovery-lfsjb\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.254341 master-0 kubenswrapper[15202]: I0319 09:51:28.254208 15202 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/18d2318b-1a1b-45a3-9b06-ea750daf9e17-image-data\") pod \"edpm-a-provisionserver-checksum-discovery-lfsjb\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.254553 master-0 kubenswrapper[15202]: I0319 09:51:28.254397 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v5bq\" (UniqueName: \"kubernetes.io/projected/18d2318b-1a1b-45a3-9b06-ea750daf9e17-kube-api-access-5v5bq\") pod \"edpm-a-provisionserver-checksum-discovery-lfsjb\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.255769 master-0 kubenswrapper[15202]: I0319 09:51:28.255739 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/18d2318b-1a1b-45a3-9b06-ea750daf9e17-image-data\") pod \"edpm-a-provisionserver-checksum-discovery-lfsjb\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.286971 master-0 kubenswrapper[15202]: I0319 09:51:28.286917 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v5bq\" (UniqueName: \"kubernetes.io/projected/18d2318b-1a1b-45a3-9b06-ea750daf9e17-kube-api-access-5v5bq\") pod \"edpm-a-provisionserver-checksum-discovery-lfsjb\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.388705 master-0 kubenswrapper[15202]: I0319 09:51:28.388592 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:28.842033 master-0 kubenswrapper[15202]: I0319 09:51:28.841904 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-api-0" event={"ID":"29cc00b0-1537-42ce-b8ce-918dea958cf9","Type":"ContainerStarted","Data":"053f1cc874c941b0c89f27a330a505232765dfec030d27efd502ee805fa57dbf"} Mar 19 09:51:28.842539 master-0 kubenswrapper[15202]: I0319 09:51:28.842201 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-7ba05-api-0" Mar 19 09:51:28.901535 master-0 kubenswrapper[15202]: I0319 09:51:28.901434 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-lfsjb"] Mar 19 09:51:28.903970 master-0 kubenswrapper[15202]: I0319 09:51:28.903814 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-api-0" podStartSLOduration=3.903792329 podStartE2EDuration="3.903792329s" podCreationTimestamp="2026-03-19 09:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:28.88721484 +0000 UTC m=+1606.272629656" watchObservedRunningTime="2026-03-19 09:51:28.903792329 +0000 UTC m=+1606.289207145" Mar 19 09:51:29.072577 master-0 kubenswrapper[15202]: I0319 09:51:29.072008 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:29.177917 master-0 kubenswrapper[15202]: I0319 09:51:29.177849 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:29.191842 master-0 kubenswrapper[15202]: I0319 09:51:29.189156 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:29.258956 master-0 kubenswrapper[15202]: 
I0319 09:51:29.258784 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:29.311637 master-0 kubenswrapper[15202]: I0319 09:51:29.311445 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-backup-0"] Mar 19 09:51:29.399141 master-0 kubenswrapper[15202]: I0319 09:51:29.398920 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:29.555646 master-0 kubenswrapper[15202]: I0319 09:51:29.555612 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" Mar 19 09:51:29.857349 master-0 kubenswrapper[15202]: I0319 09:51:29.857217 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" event={"ID":"18d2318b-1a1b-45a3-9b06-ea750daf9e17","Type":"ContainerStarted","Data":"b351eff4c20bb908b338fde6cdda00d87e674e8f9831fe42719e7a444cd88df8"} Mar 19 09:51:29.857349 master-0 kubenswrapper[15202]: I0319 09:51:29.857278 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" event={"ID":"18d2318b-1a1b-45a3-9b06-ea750daf9e17","Type":"ContainerStarted","Data":"f478d50c419fe4e245be20df9ad42de75814761db2769d84704dfa5a770731ba"} Mar 19 09:51:29.858415 master-0 kubenswrapper[15202]: I0319 09:51:29.858137 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-backup-0" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="cinder-backup" containerID="cri-o://7bc8077f80a1a873040b36173eedf2d6f180726ba0a067e4a115c34e9692097e" gracePeriod=30 Mar 19 09:51:29.861074 master-0 kubenswrapper[15202]: I0319 09:51:29.859245 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" 
containerName="cinder-volume" containerID="cri-o://b893ea09ee5463171fc8ea80053d04640ec43ceccf6e1dde295d39c419f750d3" gracePeriod=30 Mar 19 09:51:29.861074 master-0 kubenswrapper[15202]: I0319 09:51:29.859331 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-backup-0" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="probe" containerID="cri-o://c734fd4d54f1b4c7366360129b3be32cf4dc3eb52370ac898b6412882342c45c" gracePeriod=30 Mar 19 09:51:29.861074 master-0 kubenswrapper[15202]: I0319 09:51:29.859397 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="probe" containerID="cri-o://e008b9f9b07ae34f63998bc35569b46259da23c00f3c750499c0a6f4862ea4a0" gracePeriod=30 Mar 19 09:51:30.315150 master-0 kubenswrapper[15202]: I0319 09:51:30.306590 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb6bf676c-xlvsw"] Mar 19 09:51:30.315150 master-0 kubenswrapper[15202]: I0319 09:51:30.306883 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="dnsmasq-dns" containerID="cri-o://9941eb25adc16021af551ad625b756cead27849d221b0f4deee8036d26ddd3fa" gracePeriod=10 Mar 19 09:51:30.398277 master-0 kubenswrapper[15202]: I0319 09:51:30.398217 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"] Mar 19 09:51:30.875006 master-0 kubenswrapper[15202]: I0319 09:51:30.874949 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"a8885e16-c286-4619-9b4e-4d7ae54d5753","Type":"ContainerDied","Data":"e008b9f9b07ae34f63998bc35569b46259da23c00f3c750499c0a6f4862ea4a0"} Mar 19 09:51:30.876799 master-0 kubenswrapper[15202]: I0319 09:51:30.875995 15202 
generic.go:334] "Generic (PLEG): container finished" podID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerID="e008b9f9b07ae34f63998bc35569b46259da23c00f3c750499c0a6f4862ea4a0" exitCode=0 Mar 19 09:51:30.876799 master-0 kubenswrapper[15202]: I0319 09:51:30.876095 15202 generic.go:334] "Generic (PLEG): container finished" podID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerID="b893ea09ee5463171fc8ea80053d04640ec43ceccf6e1dde295d39c419f750d3" exitCode=0 Mar 19 09:51:30.876799 master-0 kubenswrapper[15202]: I0319 09:51:30.876274 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"a8885e16-c286-4619-9b4e-4d7ae54d5753","Type":"ContainerDied","Data":"b893ea09ee5463171fc8ea80053d04640ec43ceccf6e1dde295d39c419f750d3"} Mar 19 09:51:30.880881 master-0 kubenswrapper[15202]: I0319 09:51:30.880820 15202 generic.go:334] "Generic (PLEG): container finished" podID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerID="9941eb25adc16021af551ad625b756cead27849d221b0f4deee8036d26ddd3fa" exitCode=0 Mar 19 09:51:30.881232 master-0 kubenswrapper[15202]: I0319 09:51:30.881032 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-scheduler-0" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="cinder-scheduler" containerID="cri-o://2e4b8a0163a3376a870446b23c716df8eff0060d7686a913ba754944ccbb4e0e" gracePeriod=30 Mar 19 09:51:30.881509 master-0 kubenswrapper[15202]: I0319 09:51:30.881285 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" event={"ID":"76d834e7-4d24-4e34-8ebd-b71c80766e40","Type":"ContainerDied","Data":"9941eb25adc16021af551ad625b756cead27849d221b0f4deee8036d26ddd3fa"} Mar 19 09:51:30.881509 master-0 kubenswrapper[15202]: I0319 09:51:30.881315 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" 
event={"ID":"76d834e7-4d24-4e34-8ebd-b71c80766e40","Type":"ContainerDied","Data":"a60a2f6ac5cc3ceb3c2ba6142fefb082d56944de0cc0fa9989230c1facd37f39"} Mar 19 09:51:30.881509 master-0 kubenswrapper[15202]: I0319 09:51:30.881325 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a60a2f6ac5cc3ceb3c2ba6142fefb082d56944de0cc0fa9989230c1facd37f39" Mar 19 09:51:30.881721 master-0 kubenswrapper[15202]: I0319 09:51:30.881662 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7ba05-scheduler-0" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="probe" containerID="cri-o://9f75f7a49f0c64e8251eca0e1af7f9568a28cfb22959ddd5127b289d83248fb2" gracePeriod=30 Mar 19 09:51:30.975288 master-0 kubenswrapper[15202]: I0319 09:51:30.974960 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.055789 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-svc\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.055866 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-nb\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.055950 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-config\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: 
\"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.056032 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbck\" (UniqueName: \"kubernetes.io/projected/76d834e7-4d24-4e34-8ebd-b71c80766e40-kube-api-access-jdbck\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.056060 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-sb\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.056263 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-swift-storage-0\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.056497 master-0 kubenswrapper[15202]: I0319 09:51:31.056353 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-edpm-a\") pod \"76d834e7-4d24-4e34-8ebd-b71c80766e40\" (UID: \"76d834e7-4d24-4e34-8ebd-b71c80766e40\") " Mar 19 09:51:31.117130 master-0 kubenswrapper[15202]: I0319 09:51:31.116839 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d834e7-4d24-4e34-8ebd-b71c80766e40-kube-api-access-jdbck" (OuterVolumeSpecName: "kube-api-access-jdbck") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "kube-api-access-jdbck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:31.175499 master-0 kubenswrapper[15202]: I0319 09:51:31.160194 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbck\" (UniqueName: \"kubernetes.io/projected/76d834e7-4d24-4e34-8ebd-b71c80766e40-kube-api-access-jdbck\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.195502 master-0 kubenswrapper[15202]: I0319 09:51:31.179959 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:31.214332 master-0 kubenswrapper[15202]: I0319 09:51:31.214259 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:31.218687 master-0 kubenswrapper[15202]: I0319 09:51:31.218544 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:31.238109 master-0 kubenswrapper[15202]: I0319 09:51:31.236276 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-config" (OuterVolumeSpecName: "config") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:31.263337 master-0 kubenswrapper[15202]: I0319 09:51:31.256128 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:31.271116 master-0 kubenswrapper[15202]: I0319 09:51:31.264948 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.271116 master-0 kubenswrapper[15202]: I0319 09:51:31.264993 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.271116 master-0 kubenswrapper[15202]: I0319 09:51:31.265005 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.271116 master-0 kubenswrapper[15202]: I0319 09:51:31.265015 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.271116 master-0 kubenswrapper[15202]: I0319 09:51:31.265027 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.315653 master-0 kubenswrapper[15202]: I0319 09:51:31.307215 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "76d834e7-4d24-4e34-8ebd-b71c80766e40" (UID: "76d834e7-4d24-4e34-8ebd-b71c80766e40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:31.368804 master-0 kubenswrapper[15202]: I0319 09:51:31.368743 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76d834e7-4d24-4e34-8ebd-b71c80766e40-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.391226 master-0 kubenswrapper[15202]: I0319 09:51:31.391158 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:31.470820 master-0 kubenswrapper[15202]: I0319 09:51:31.470742 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471090 master-0 kubenswrapper[15202]: I0319 09:51:31.470836 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-lib-modules\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471090 master-0 kubenswrapper[15202]: I0319 09:51:31.470936 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-cinder\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471090 master-0 kubenswrapper[15202]: I0319 09:51:31.470968 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnhg2\" (UniqueName: \"kubernetes.io/projected/a8885e16-c286-4619-9b4e-4d7ae54d5753-kube-api-access-cnhg2\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471090 master-0 kubenswrapper[15202]: I0319 09:51:31.471053 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-iscsi\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471090 master-0 kubenswrapper[15202]: I0319 09:51:31.471079 15202 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-dev\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471344 master-0 kubenswrapper[15202]: I0319 09:51:31.471246 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-brick\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471344 master-0 kubenswrapper[15202]: I0319 09:51:31.471322 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-machine-id\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471435 master-0 kubenswrapper[15202]: I0319 09:51:31.471364 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-nvme\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471435 master-0 kubenswrapper[15202]: I0319 09:51:31.471389 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-run\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471557 master-0 kubenswrapper[15202]: I0319 09:51:31.471439 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data-custom\") pod 
\"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.471557 master-0 kubenswrapper[15202]: I0319 09:51:31.471476 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-combined-ca-bundle\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472025 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-dev" (OuterVolumeSpecName: "dev") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472024 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-sys\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472102 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-scripts\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472131 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-sys" (OuterVolumeSpecName: "sys") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472184 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472139 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472219 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.472590 master-0 kubenswrapper[15202]: I0319 09:51:31.472156 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-lib-cinder\") pod \"a8885e16-c286-4619-9b4e-4d7ae54d5753\" (UID: \"a8885e16-c286-4619-9b4e-4d7ae54d5753\") " Mar 19 09:51:31.473842 master-0 kubenswrapper[15202]: I0319 09:51:31.473813 15202 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.473842 master-0 kubenswrapper[15202]: I0319 09:51:31.473836 15202 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.473842 master-0 kubenswrapper[15202]: I0319 09:51:31.473847 15202 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-dev\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.473857 15202 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-sys\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.473867 15202 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.473912 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.473939 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.473961 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.473981 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.474001 master-0 kubenswrapper[15202]: I0319 09:51:31.474000 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-run" (OuterVolumeSpecName: "run") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). 
InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:31.480112 master-0 kubenswrapper[15202]: I0319 09:51:31.480061 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8885e16-c286-4619-9b4e-4d7ae54d5753-kube-api-access-cnhg2" (OuterVolumeSpecName: "kube-api-access-cnhg2") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "kube-api-access-cnhg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:31.482981 master-0 kubenswrapper[15202]: I0319 09:51:31.482920 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-scripts" (OuterVolumeSpecName: "scripts") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:31.491555 master-0 kubenswrapper[15202]: I0319 09:51:31.488683 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575774 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnhg2\" (UniqueName: \"kubernetes.io/projected/a8885e16-c286-4619-9b4e-4d7ae54d5753-kube-api-access-cnhg2\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575833 15202 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575844 15202 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575854 15202 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575864 15202 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575872 15202 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8885e16-c286-4619-9b4e-4d7ae54d5753-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575881 15202 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.575929 master-0 kubenswrapper[15202]: I0319 09:51:31.575890 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.609869 master-0 kubenswrapper[15202]: I0319 09:51:31.609812 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:51:31.613432 master-0 kubenswrapper[15202]: I0319 09:51:31.613354 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:31.613432 master-0 kubenswrapper[15202]: I0319 09:51:31.614293 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:51:31.649223 master-0 kubenswrapper[15202]: I0319 09:51:31.649138 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data" (OuterVolumeSpecName: "config-data") pod "a8885e16-c286-4619-9b4e-4d7ae54d5753" (UID: "a8885e16-c286-4619-9b4e-4d7ae54d5753"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:31.682566 master-0 kubenswrapper[15202]: I0319 09:51:31.682505 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.682734 master-0 kubenswrapper[15202]: I0319 09:51:31.682617 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8885e16-c286-4619-9b4e-4d7ae54d5753-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:31.895051 master-0 kubenswrapper[15202]: I0319 09:51:31.894971 15202 generic.go:334] "Generic (PLEG): container finished" podID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerID="c734fd4d54f1b4c7366360129b3be32cf4dc3eb52370ac898b6412882342c45c" exitCode=0 Mar 19 09:51:31.895051 master-0 kubenswrapper[15202]: I0319 09:51:31.895023 15202 generic.go:334] "Generic (PLEG): container finished" podID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerID="7bc8077f80a1a873040b36173eedf2d6f180726ba0a067e4a115c34e9692097e" exitCode=0 Mar 19 09:51:31.895051 master-0 kubenswrapper[15202]: I0319 09:51:31.895055 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"19c324ae-8b95-430e-b544-90e2a4b5ff33","Type":"ContainerDied","Data":"c734fd4d54f1b4c7366360129b3be32cf4dc3eb52370ac898b6412882342c45c"} Mar 19 09:51:31.895850 master-0 kubenswrapper[15202]: I0319 09:51:31.895099 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"19c324ae-8b95-430e-b544-90e2a4b5ff33","Type":"ContainerDied","Data":"7bc8077f80a1a873040b36173eedf2d6f180726ba0a067e4a115c34e9692097e"} Mar 19 09:51:31.897692 master-0 kubenswrapper[15202]: I0319 09:51:31.897596 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" 
event={"ID":"a8885e16-c286-4619-9b4e-4d7ae54d5753","Type":"ContainerDied","Data":"0a288268ee6e842f4f0be884ed78277d2d4f2ecb7533e8bc4d3c83a7bde6a5bd"} Mar 19 09:51:31.897692 master-0 kubenswrapper[15202]: I0319 09:51:31.897668 15202 scope.go:117] "RemoveContainer" containerID="e008b9f9b07ae34f63998bc35569b46259da23c00f3c750499c0a6f4862ea4a0" Mar 19 09:51:31.897802 master-0 kubenswrapper[15202]: I0319 09:51:31.897607 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:31.904945 master-0 kubenswrapper[15202]: I0319 09:51:31.904845 15202 generic.go:334] "Generic (PLEG): container finished" podID="9939df7e-1cba-4d74-95d7-524376a36627" containerID="9f75f7a49f0c64e8251eca0e1af7f9568a28cfb22959ddd5127b289d83248fb2" exitCode=0 Mar 19 09:51:31.905116 master-0 kubenswrapper[15202]: I0319 09:51:31.904926 15202 generic.go:334] "Generic (PLEG): container finished" podID="9939df7e-1cba-4d74-95d7-524376a36627" containerID="2e4b8a0163a3376a870446b23c716df8eff0060d7686a913ba754944ccbb4e0e" exitCode=0 Mar 19 09:51:31.905116 master-0 kubenswrapper[15202]: I0319 09:51:31.905049 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" Mar 19 09:51:31.905985 master-0 kubenswrapper[15202]: I0319 09:51:31.904927 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"9939df7e-1cba-4d74-95d7-524376a36627","Type":"ContainerDied","Data":"9f75f7a49f0c64e8251eca0e1af7f9568a28cfb22959ddd5127b289d83248fb2"} Mar 19 09:51:31.906048 master-0 kubenswrapper[15202]: I0319 09:51:31.905998 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"9939df7e-1cba-4d74-95d7-524376a36627","Type":"ContainerDied","Data":"2e4b8a0163a3376a870446b23c716df8eff0060d7686a913ba754944ccbb4e0e"} Mar 19 09:51:31.913712 master-0 kubenswrapper[15202]: I0319 09:51:31.913658 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm" Mar 19 09:51:31.969898 master-0 kubenswrapper[15202]: I0319 09:51:31.969826 15202 scope.go:117] "RemoveContainer" containerID="b893ea09ee5463171fc8ea80053d04640ec43ceccf6e1dde295d39c419f750d3" Mar 19 09:51:32.064512 master-0 kubenswrapper[15202]: I0319 09:51:32.062585 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:32.093806 master-0 kubenswrapper[15202]: I0319 09:51:32.089679 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:32.141575 master-0 kubenswrapper[15202]: I0319 09:51:32.141373 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb6bf676c-xlvsw"] Mar 19 09:51:32.172615 master-0 kubenswrapper[15202]: I0319 09:51:32.172559 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:32.173132 master-0 kubenswrapper[15202]: E0319 09:51:32.173111 15202 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="init" Mar 19 09:51:32.173132 master-0 kubenswrapper[15202]: I0319 09:51:32.173131 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="init" Mar 19 09:51:32.173267 master-0 kubenswrapper[15202]: E0319 09:51:32.173151 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="probe" Mar 19 09:51:32.173267 master-0 kubenswrapper[15202]: I0319 09:51:32.173159 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="probe" Mar 19 09:51:32.173267 master-0 kubenswrapper[15202]: E0319 09:51:32.173185 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="dnsmasq-dns" Mar 19 09:51:32.173267 master-0 kubenswrapper[15202]: I0319 09:51:32.173191 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="dnsmasq-dns" Mar 19 09:51:32.173267 master-0 kubenswrapper[15202]: E0319 09:51:32.173227 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="cinder-volume" Mar 19 09:51:32.173267 master-0 kubenswrapper[15202]: I0319 09:51:32.173234 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="cinder-volume" Mar 19 09:51:32.175393 master-0 kubenswrapper[15202]: I0319 09:51:32.175354 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="dnsmasq-dns" Mar 19 09:51:32.175393 master-0 kubenswrapper[15202]: I0319 09:51:32.175384 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="cinder-volume" Mar 19 09:51:32.175536 master-0 kubenswrapper[15202]: I0319 
09:51:32.175415 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" containerName="probe" Mar 19 09:51:32.185538 master-0 kubenswrapper[15202]: I0319 09:51:32.178160 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.185538 master-0 kubenswrapper[15202]: I0319 09:51:32.180370 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-volume-lvm-iscsi-config-data" Mar 19 09:51:32.205030 master-0 kubenswrapper[15202]: I0319 09:51:32.204592 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-config-data-custom\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205030 master-0 kubenswrapper[15202]: I0319 09:51:32.204817 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-config-data\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205030 master-0 kubenswrapper[15202]: I0319 09:51:32.204952 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-run\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205327 master-0 kubenswrapper[15202]: I0319 09:51:32.205080 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-iscsi\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205327 master-0 kubenswrapper[15202]: I0319 09:51:32.205159 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-combined-ca-bundle\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205327 master-0 kubenswrapper[15202]: I0319 09:51:32.205198 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-lib-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205327 master-0 kubenswrapper[15202]: I0319 09:51:32.205248 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-nvme\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205501 master-0 kubenswrapper[15202]: I0319 09:51:32.205432 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-locks-brick\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205501 master-0 kubenswrapper[15202]: I0319 
09:51:32.205493 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-sys\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205593 master-0 kubenswrapper[15202]: I0319 09:51:32.205543 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp6hd\" (UniqueName: \"kubernetes.io/projected/c8a687b2-1e3a-4510-af7c-34277b366455-kube-api-access-gp6hd\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205634 master-0 kubenswrapper[15202]: I0319 09:51:32.205605 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-machine-id\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205816 master-0 kubenswrapper[15202]: I0319 09:51:32.205756 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-locks-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.205871 master-0 kubenswrapper[15202]: I0319 09:51:32.205840 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-lib-modules\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: 
\"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.206028 master-0 kubenswrapper[15202]: I0319 09:51:32.205972 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-dev\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.206104 master-0 kubenswrapper[15202]: I0319 09:51:32.206051 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-scripts\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.208147 master-0 kubenswrapper[15202]: I0319 09:51:32.208098 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb6bf676c-xlvsw"] Mar 19 09:51:32.241831 master-0 kubenswrapper[15202]: I0319 09:51:32.241762 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"] Mar 19 09:51:32.253767 master-0 kubenswrapper[15202]: I0319 09:51:32.253678 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-x7j8z"] Mar 19 09:51:32.256820 master-0 kubenswrapper[15202]: I0319 09:51:32.256775 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.269548 master-0 kubenswrapper[15202]: I0319 09:51:32.269490 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-x7j8z"] Mar 19 09:51:32.309057 master-0 kubenswrapper[15202]: I0319 09:51:32.308893 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp6hd\" (UniqueName: \"kubernetes.io/projected/c8a687b2-1e3a-4510-af7c-34277b366455-kube-api-access-gp6hd\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309057 master-0 kubenswrapper[15202]: I0319 09:51:32.308976 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-machine-id\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309057 master-0 kubenswrapper[15202]: I0319 09:51:32.309050 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-locks-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309092 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-lib-modules\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 
09:51:32.309144 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-dev\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309193 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-image-data\") pod \"edpm-b-provisionserver-checksum-discovery-x7j8z\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309233 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-scripts\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309295 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-config-data-custom\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309326 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-config-data\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 
09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309358 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-run\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309385 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tptlr\" (UniqueName: \"kubernetes.io/projected/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-kube-api-access-tptlr\") pod \"edpm-b-provisionserver-checksum-discovery-x7j8z\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309441 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-iscsi\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309478 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-combined-ca-bundle\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309522 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-nvme\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: 
\"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.309545 master-0 kubenswrapper[15202]: I0319 09:51:32.309541 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-lib-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.309580 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-locks-brick\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.309601 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-sys\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.309714 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-sys\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.309755 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-run\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: 
\"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.309789 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-iscsi\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310351 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-dev\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310423 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-machine-id\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310525 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-locks-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310554 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-lib-modules\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: 
\"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310591 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-lib-cinder\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310630 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-etc-nvme\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.311457 master-0 kubenswrapper[15202]: I0319 09:51:32.310671 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c8a687b2-1e3a-4510-af7c-34277b366455-var-locks-brick\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.313947 master-0 kubenswrapper[15202]: I0319 09:51:32.313861 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-scripts\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.316368 master-0 kubenswrapper[15202]: I0319 09:51:32.316333 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-config-data-custom\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: 
\"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.316758 master-0 kubenswrapper[15202]: I0319 09:51:32.316688 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-config-data\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.319936 master-0 kubenswrapper[15202]: I0319 09:51:32.318800 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a687b2-1e3a-4510-af7c-34277b366455-combined-ca-bundle\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.319936 master-0 kubenswrapper[15202]: I0319 09:51:32.319322 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:32.340270 master-0 kubenswrapper[15202]: I0319 09:51:32.340199 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp6hd\" (UniqueName: \"kubernetes.io/projected/c8a687b2-1e3a-4510-af7c-34277b366455-kube-api-access-gp6hd\") pod \"cinder-7ba05-volume-lvm-iscsi-0\" (UID: \"c8a687b2-1e3a-4510-af7c-34277b366455\") " pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411318 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-sys\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411412 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-lib-cinder\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411437 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-iscsi\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411566 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-dev\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411604 15202 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411675 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-scripts\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411694 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-dev" (OuterVolumeSpecName: "dev") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411707 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxsb8\" (UniqueName: \"kubernetes.io/projected/19c324ae-8b95-430e-b544-90e2a4b5ff33-kube-api-access-rxsb8\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411724 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411755 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-combined-ca-bundle\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.411967 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412122 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data-custom\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412239 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-brick\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412451 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-nvme\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412295 15202 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-sys" (OuterVolumeSpecName: "sys") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412600 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-lib-modules\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412633 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-machine-id\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412625 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412685 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-run" (OuterVolumeSpecName: "run") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412654 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-run\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412717 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412742 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412742 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-cinder\") pod \"19c324ae-8b95-430e-b544-90e2a4b5ff33\" (UID: \"19c324ae-8b95-430e-b544-90e2a4b5ff33\") " Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412766 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.413034 master-0 kubenswrapper[15202]: I0319 09:51:32.412940 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:51:32.414084 master-0 kubenswrapper[15202]: I0319 09:51:32.413569 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-image-data\") pod \"edpm-b-provisionserver-checksum-discovery-x7j8z\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.414084 master-0 kubenswrapper[15202]: I0319 09:51:32.413806 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tptlr\" (UniqueName: \"kubernetes.io/projected/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-kube-api-access-tptlr\") pod \"edpm-b-provisionserver-checksum-discovery-x7j8z\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414207 15202 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414230 15202 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414240 15202 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414249 15202 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414258 15202 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-sys\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414266 15202 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414276 15202 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414285 15202 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-dev\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414294 15202 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414304 15202 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/19c324ae-8b95-430e-b544-90e2a4b5ff33-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.417426 master-0 kubenswrapper[15202]: I0319 09:51:32.414336 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: 
\"kubernetes.io/empty-dir/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-image-data\") pod \"edpm-b-provisionserver-checksum-discovery-x7j8z\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.436009 master-0 kubenswrapper[15202]: I0319 09:51:32.435948 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-scripts" (OuterVolumeSpecName: "scripts") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:32.445021 master-0 kubenswrapper[15202]: I0319 09:51:32.444970 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:32.497560 master-0 kubenswrapper[15202]: I0319 09:51:32.492056 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c324ae-8b95-430e-b544-90e2a4b5ff33-kube-api-access-rxsb8" (OuterVolumeSpecName: "kube-api-access-rxsb8") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "kube-api-access-rxsb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:32.497560 master-0 kubenswrapper[15202]: I0319 09:51:32.493691 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tptlr\" (UniqueName: \"kubernetes.io/projected/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-kube-api-access-tptlr\") pod \"edpm-b-provisionserver-checksum-discovery-x7j8z\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.509161 master-0 kubenswrapper[15202]: I0319 09:51:32.509106 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:32.521534 master-0 kubenswrapper[15202]: I0319 09:51:32.521455 15202 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.521534 master-0 kubenswrapper[15202]: I0319 09:51:32.521524 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.521534 master-0 kubenswrapper[15202]: I0319 09:51:32.521536 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxsb8\" (UniqueName: \"kubernetes.io/projected/19c324ae-8b95-430e-b544-90e2a4b5ff33-kube-api-access-rxsb8\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.606890 master-0 kubenswrapper[15202]: I0319 09:51:32.605398 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:32.625507 master-0 kubenswrapper[15202]: I0319 09:51:32.623387 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:32.754763 master-0 kubenswrapper[15202]: I0319 09:51:32.747378 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:32.771505 master-0 kubenswrapper[15202]: I0319 09:51:32.768942 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data" (OuterVolumeSpecName: "config-data") pod "19c324ae-8b95-430e-b544-90e2a4b5ff33" (UID: "19c324ae-8b95-430e-b544-90e2a4b5ff33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:32.789509 master-0 kubenswrapper[15202]: I0319 09:51:32.782622 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:32.832508 master-0 kubenswrapper[15202]: I0319 09:51:32.832195 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-scripts\") pod \"9939df7e-1cba-4d74-95d7-524376a36627\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " Mar 19 09:51:32.832508 master-0 kubenswrapper[15202]: I0319 09:51:32.832473 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data\") pod \"9939df7e-1cba-4d74-95d7-524376a36627\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " Mar 19 09:51:32.832845 master-0 kubenswrapper[15202]: I0319 09:51:32.832648 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr4tx\" (UniqueName: \"kubernetes.io/projected/9939df7e-1cba-4d74-95d7-524376a36627-kube-api-access-tr4tx\") pod \"9939df7e-1cba-4d74-95d7-524376a36627\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " Mar 19 09:51:32.832845 master-0 kubenswrapper[15202]: I0319 09:51:32.832684 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9939df7e-1cba-4d74-95d7-524376a36627-etc-machine-id\") pod \"9939df7e-1cba-4d74-95d7-524376a36627\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " Mar 19 09:51:32.832845 master-0 kubenswrapper[15202]: I0319 09:51:32.832720 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data-custom\") pod \"9939df7e-1cba-4d74-95d7-524376a36627\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") " Mar 19 09:51:32.832845 master-0 kubenswrapper[15202]: I0319 09:51:32.832783 15202 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-combined-ca-bundle\") pod \"9939df7e-1cba-4d74-95d7-524376a36627\" (UID: \"9939df7e-1cba-4d74-95d7-524376a36627\") "
Mar 19 09:51:32.844509 master-0 kubenswrapper[15202]: I0319 09:51:32.833356 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c324ae-8b95-430e-b544-90e2a4b5ff33-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:32.844509 master-0 kubenswrapper[15202]: I0319 09:51:32.837812 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9939df7e-1cba-4d74-95d7-524376a36627-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9939df7e-1cba-4d74-95d7-524376a36627" (UID: "9939df7e-1cba-4d74-95d7-524376a36627"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:51:32.857507 master-0 kubenswrapper[15202]: I0319 09:51:32.853831 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9939df7e-1cba-4d74-95d7-524376a36627-kube-api-access-tr4tx" (OuterVolumeSpecName: "kube-api-access-tr4tx") pod "9939df7e-1cba-4d74-95d7-524376a36627" (UID: "9939df7e-1cba-4d74-95d7-524376a36627"). InnerVolumeSpecName "kube-api-access-tr4tx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:51:32.857507 master-0 kubenswrapper[15202]: I0319 09:51:32.854782 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9939df7e-1cba-4d74-95d7-524376a36627" (UID: "9939df7e-1cba-4d74-95d7-524376a36627"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:32.898536 master-0 kubenswrapper[15202]: I0319 09:51:32.875728 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-scripts" (OuterVolumeSpecName: "scripts") pod "9939df7e-1cba-4d74-95d7-524376a36627" (UID: "9939df7e-1cba-4d74-95d7-524376a36627"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:32.939508 master-0 kubenswrapper[15202]: I0319 09:51:32.922408 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" path="/var/lib/kubelet/pods/76d834e7-4d24-4e34-8ebd-b71c80766e40/volumes"
Mar 19 09:51:32.939508 master-0 kubenswrapper[15202]: I0319 09:51:32.923179 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8885e16-c286-4619-9b4e-4d7ae54d5753" path="/var/lib/kubelet/pods/a8885e16-c286-4619-9b4e-4d7ae54d5753/volumes"
Mar 19 09:51:32.939508 master-0 kubenswrapper[15202]: I0319 09:51:32.935462 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr4tx\" (UniqueName: \"kubernetes.io/projected/9939df7e-1cba-4d74-95d7-524376a36627-kube-api-access-tr4tx\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:32.939508 master-0 kubenswrapper[15202]: I0319 09:51:32.935515 15202 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9939df7e-1cba-4d74-95d7-524376a36627-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:32.939508 master-0 kubenswrapper[15202]: I0319 09:51:32.935526 15202 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:32.939508 master-0 kubenswrapper[15202]: I0319 09:51:32.935536 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:33.016530 master-0 kubenswrapper[15202]: I0319 09:51:33.004283 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"19c324ae-8b95-430e-b544-90e2a4b5ff33","Type":"ContainerDied","Data":"8f79868713decfab9207d90f18f83e63e36cd1bfcff3ee7330acc2d2eb3b740e"}
Mar 19 09:51:33.016530 master-0 kubenswrapper[15202]: I0319 09:51:33.004360 15202 scope.go:117] "RemoveContainer" containerID="c734fd4d54f1b4c7366360129b3be32cf4dc3eb52370ac898b6412882342c45c"
Mar 19 09:51:33.016530 master-0 kubenswrapper[15202]: I0319 09:51:33.004463 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.038348 master-0 kubenswrapper[15202]: I0319 09:51:33.026031 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9939df7e-1cba-4d74-95d7-524376a36627" (UID: "9939df7e-1cba-4d74-95d7-524376a36627"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:33.038348 master-0 kubenswrapper[15202]: I0319 09:51:33.037978 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:33.090836 master-0 kubenswrapper[15202]: I0319 09:51:33.090044 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"9939df7e-1cba-4d74-95d7-524376a36627","Type":"ContainerDied","Data":"bf3ffd7f07ae452dfcf4688538269ac19f8f412f9044ed366c9155a28678d2f9"}
Mar 19 09:51:33.108344 master-0 kubenswrapper[15202]: I0319 09:51:33.107267 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.162844 master-0 kubenswrapper[15202]: I0319 09:51:33.144812 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-backup-0"]
Mar 19 09:51:33.189306 master-0 kubenswrapper[15202]: I0319 09:51:33.189216 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data" (OuterVolumeSpecName: "config-data") pod "9939df7e-1cba-4d74-95d7-524376a36627" (UID: "9939df7e-1cba-4d74-95d7-524376a36627"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:33.238417 master-0 kubenswrapper[15202]: I0319 09:51:33.237825 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ba05-backup-0"]
Mar 19 09:51:33.258250 master-0 kubenswrapper[15202]: I0319 09:51:33.257604 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-backup-0"]
Mar 19 09:51:33.258250 master-0 kubenswrapper[15202]: I0319 09:51:33.258085 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9939df7e-1cba-4d74-95d7-524376a36627-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:33.258354 master-0 kubenswrapper[15202]: E0319 09:51:33.258280 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="cinder-backup"
Mar 19 09:51:33.258354 master-0 kubenswrapper[15202]: I0319 09:51:33.258302 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="cinder-backup"
Mar 19 09:51:33.258354 master-0 kubenswrapper[15202]: E0319 09:51:33.258352 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="probe"
Mar 19 09:51:33.258447 master-0 kubenswrapper[15202]: I0319 09:51:33.258361 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="probe"
Mar 19 09:51:33.258447 master-0 kubenswrapper[15202]: E0319 09:51:33.258396 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="cinder-scheduler"
Mar 19 09:51:33.258447 master-0 kubenswrapper[15202]: I0319 09:51:33.258406 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="cinder-scheduler"
Mar 19 09:51:33.258447 master-0 kubenswrapper[15202]: E0319 09:51:33.258440 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="probe"
Mar 19 09:51:33.258447 master-0 kubenswrapper[15202]: I0319 09:51:33.258448 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="probe"
Mar 19 09:51:33.258764 master-0 kubenswrapper[15202]: I0319 09:51:33.258730 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="probe"
Mar 19 09:51:33.258807 master-0 kubenswrapper[15202]: I0319 09:51:33.258772 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="cinder-scheduler"
Mar 19 09:51:33.258807 master-0 kubenswrapper[15202]: I0319 09:51:33.258797 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9939df7e-1cba-4d74-95d7-524376a36627" containerName="probe"
Mar 19 09:51:33.258879 master-0 kubenswrapper[15202]: I0319 09:51:33.258846 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" containerName="cinder-backup"
Mar 19 09:51:33.260444 master-0 kubenswrapper[15202]: I0319 09:51:33.260412 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.271261 master-0 kubenswrapper[15202]: I0319 09:51:33.271118 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-backup-config-data"
Mar 19 09:51:33.296070 master-0 kubenswrapper[15202]: I0319 09:51:33.295705 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-backup-0"]
Mar 19 09:51:33.302171 master-0 kubenswrapper[15202]: W0319 09:51:33.302117 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8a687b2_1e3a_4510_af7c_34277b366455.slice/crio-b9a2bedd56a96e9a20bdc0aa54c5836e1df8f3b26a144076ffb68fc2abfc457e WatchSource:0}: Error finding container b9a2bedd56a96e9a20bdc0aa54c5836e1df8f3b26a144076ffb68fc2abfc457e: Status 404 returned error can't find the container with id b9a2bedd56a96e9a20bdc0aa54c5836e1df8f3b26a144076ffb68fc2abfc457e
Mar 19 09:51:33.316871 master-0 kubenswrapper[15202]: I0319 09:51:33.316800 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-volume-lvm-iscsi-0"]
Mar 19 09:51:33.324851 master-0 kubenswrapper[15202]: I0319 09:51:33.322648 15202 scope.go:117] "RemoveContainer" containerID="7bc8077f80a1a873040b36173eedf2d6f180726ba0a067e4a115c34e9692097e"
Mar 19 09:51:33.361753 master-0 kubenswrapper[15202]: I0319 09:51:33.361610 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-sys\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.361753 master-0 kubenswrapper[15202]: I0319 09:51:33.361770 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-scripts\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.361753 master-0 kubenswrapper[15202]: I0319 09:51:33.361792 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mgp\" (UniqueName: \"kubernetes.io/projected/eca23393-b469-4d29-bb25-0b1edae5d066-kube-api-access-x9mgp\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.361753 master-0 kubenswrapper[15202]: I0319 09:51:33.361834 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-combined-ca-bundle\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.362584 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-locks-brick\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.362653 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-nvme\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.362687 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-lib-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.362793 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-locks-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.362916 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-run\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.362952 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-iscsi\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.363031 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-lib-modules\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.363080 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-machine-id\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.363134 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-config-data\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.363284 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-dev\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.363705 master-0 kubenswrapper[15202]: I0319 09:51:33.363305 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-config-data-custom\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.442253 master-0 kubenswrapper[15202]: I0319 09:51:33.442150 15202 scope.go:117] "RemoveContainer" containerID="9f75f7a49f0c64e8251eca0e1af7f9568a28cfb22959ddd5127b289d83248fb2"
Mar 19 09:51:33.463579 master-0 kubenswrapper[15202]: I0319 09:51:33.463443 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"]
Mar 19 09:51:33.465219 master-0 kubenswrapper[15202]: I0319 09:51:33.465180 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-scripts\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466203 master-0 kubenswrapper[15202]: I0319 09:51:33.466171 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mgp\" (UniqueName: \"kubernetes.io/projected/eca23393-b469-4d29-bb25-0b1edae5d066-kube-api-access-x9mgp\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466347 master-0 kubenswrapper[15202]: I0319 09:51:33.466329 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-combined-ca-bundle\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466516 master-0 kubenswrapper[15202]: I0319 09:51:33.466497 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-locks-brick\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466614 master-0 kubenswrapper[15202]: I0319 09:51:33.466600 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-lib-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466703 master-0 kubenswrapper[15202]: I0319 09:51:33.466689 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-nvme\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466830 master-0 kubenswrapper[15202]: I0319 09:51:33.466815 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-locks-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.466955 master-0 kubenswrapper[15202]: I0319 09:51:33.466942 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-run\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.467042 master-0 kubenswrapper[15202]: I0319 09:51:33.467028 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-iscsi\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.467202 master-0 kubenswrapper[15202]: I0319 09:51:33.467188 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-lib-modules\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.467288 master-0 kubenswrapper[15202]: I0319 09:51:33.467276 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-machine-id\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.467394 master-0 kubenswrapper[15202]: I0319 09:51:33.467381 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-config-data\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.467798 master-0 kubenswrapper[15202]: I0319 09:51:33.467712 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-dev\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.468429 master-0 kubenswrapper[15202]: I0319 09:51:33.467879 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-config-data-custom\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.468631 master-0 kubenswrapper[15202]: I0319 09:51:33.468615 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-sys\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.468831 master-0 kubenswrapper[15202]: I0319 09:51:33.468817 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-sys\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.469570 master-0 kubenswrapper[15202]: I0319 09:51:33.469508 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-run\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.470818 master-0 kubenswrapper[15202]: I0319 09:51:33.470765 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-machine-id\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.470903 master-0 kubenswrapper[15202]: I0319 09:51:33.470856 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-iscsi\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.470903 master-0 kubenswrapper[15202]: I0319 09:51:33.470897 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-lib-modules\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.470994 master-0 kubenswrapper[15202]: I0319 09:51:33.470956 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-lib-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.470994 master-0 kubenswrapper[15202]: I0319 09:51:33.470992 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-locks-brick\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.471176 master-0 kubenswrapper[15202]: I0319 09:51:33.471031 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-etc-nvme\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.471176 master-0 kubenswrapper[15202]: I0319 09:51:33.471064 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-var-locks-cinder\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.471176 master-0 kubenswrapper[15202]: I0319 09:51:33.471091 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eca23393-b469-4d29-bb25-0b1edae5d066-dev\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.475186 master-0 kubenswrapper[15202]: I0319 09:51:33.475098 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-scripts\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.477064 master-0 kubenswrapper[15202]: I0319 09:51:33.476787 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-combined-ca-bundle\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.477064 master-0 kubenswrapper[15202]: I0319 09:51:33.476859 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"]
Mar 19 09:51:33.481502 master-0 kubenswrapper[15202]: I0319 09:51:33.481387 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-config-data-custom\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.491239 master-0 kubenswrapper[15202]: I0319 09:51:33.491189 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eca23393-b469-4d29-bb25-0b1edae5d066-config-data\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.510692 master-0 kubenswrapper[15202]: I0319 09:51:33.510621 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mgp\" (UniqueName: \"kubernetes.io/projected/eca23393-b469-4d29-bb25-0b1edae5d066-kube-api-access-x9mgp\") pod \"cinder-7ba05-backup-0\" (UID: \"eca23393-b469-4d29-bb25-0b1edae5d066\") " pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.526380 master-0 kubenswrapper[15202]: I0319 09:51:33.525851 15202 scope.go:117] "RemoveContainer" containerID="2e4b8a0163a3376a870446b23c716df8eff0060d7686a913ba754944ccbb4e0e"
Mar 19 09:51:33.539708 master-0 kubenswrapper[15202]: I0319 09:51:33.539644 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ba05-scheduler-0"]
Mar 19 09:51:33.543169 master-0 kubenswrapper[15202]: I0319 09:51:33.543127 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.545539 master-0 kubenswrapper[15202]: I0319 09:51:33.545281 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7ba05-scheduler-config-data"
Mar 19 09:51:33.574036 master-0 kubenswrapper[15202]: I0319 09:51:33.573342 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-config-data\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.574036 master-0 kubenswrapper[15202]: I0319 09:51:33.573433 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srp4j\" (UniqueName: \"kubernetes.io/projected/45f14454-38fd-4f69-81b6-8d66033b21d4-kube-api-access-srp4j\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.574036 master-0 kubenswrapper[15202]: I0319 09:51:33.573724 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-config-data-custom\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.574036 master-0 kubenswrapper[15202]: I0319 09:51:33.573822 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-scripts\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.574036 master-0 kubenswrapper[15202]: I0319 09:51:33.573891 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-combined-ca-bundle\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.574036 master-0 kubenswrapper[15202]: I0319 09:51:33.573969 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45f14454-38fd-4f69-81b6-8d66033b21d4-etc-machine-id\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.582279 master-0 kubenswrapper[15202]: I0319 09:51:33.582174 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"]
Mar 19 09:51:33.636600 master-0 kubenswrapper[15202]: W0319 09:51:33.636509 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d49e40e_907d_47f6_b07e_1cf72ec3f3a9.slice/crio-4dbc8fc8f7cf66a5fc9653b9985e975f615f6ea391e68e7f2c9fa93730d06910 WatchSource:0}: Error finding container 4dbc8fc8f7cf66a5fc9653b9985e975f615f6ea391e68e7f2c9fa93730d06910: Status 404 returned error can't find the container with id 4dbc8fc8f7cf66a5fc9653b9985e975f615f6ea391e68e7f2c9fa93730d06910
Mar 19 09:51:33.649920 master-0 kubenswrapper[15202]: I0319 09:51:33.649816 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-x7j8z"]
Mar 19 09:51:33.684866 master-0 kubenswrapper[15202]: I0319 09:51:33.683312 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45f14454-38fd-4f69-81b6-8d66033b21d4-etc-machine-id\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.684866 master-0 kubenswrapper[15202]: I0319 09:51:33.683451 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-config-data\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.684866 master-0 kubenswrapper[15202]: I0319 09:51:33.683540 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srp4j\" (UniqueName: \"kubernetes.io/projected/45f14454-38fd-4f69-81b6-8d66033b21d4-kube-api-access-srp4j\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.684866 master-0 kubenswrapper[15202]: I0319 09:51:33.683669 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-config-data-custom\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.684866 master-0 kubenswrapper[15202]: I0319 09:51:33.683838 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-scripts\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.684866 master-0 kubenswrapper[15202]: I0319 09:51:33.683950 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-combined-ca-bundle\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.685436 master-0 kubenswrapper[15202]: I0319 09:51:33.685234 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45f14454-38fd-4f69-81b6-8d66033b21d4-etc-machine-id\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.707889 master-0 kubenswrapper[15202]: I0319 09:51:33.694636 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-config-data\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.711426 master-0 kubenswrapper[15202]: I0319 09:51:33.710918 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-scripts\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.711990 master-0 kubenswrapper[15202]: I0319 09:51:33.711893 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-config-data-custom\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.716133 master-0 kubenswrapper[15202]: I0319 09:51:33.716084 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f14454-38fd-4f69-81b6-8d66033b21d4-combined-ca-bundle\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.730208 master-0 kubenswrapper[15202]: I0319 09:51:33.729941 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-backup-0"
Mar 19 09:51:33.763756 master-0 kubenswrapper[15202]: I0319 09:51:33.762209 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srp4j\" (UniqueName: \"kubernetes.io/projected/45f14454-38fd-4f69-81b6-8d66033b21d4-kube-api-access-srp4j\") pod \"cinder-7ba05-scheduler-0\" (UID: \"45f14454-38fd-4f69-81b6-8d66033b21d4\") " pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:33.900872 master-0 kubenswrapper[15202]: I0319 09:51:33.900474 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ba05-scheduler-0"
Mar 19 09:51:34.163315 master-0 kubenswrapper[15202]: I0319 09:51:34.163130 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" event={"ID":"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9","Type":"ContainerStarted","Data":"6350fb831fcd1852ae42fd0cac2ad4b82126f840a2f7c9cf75aed041b345afe0"}
Mar 19 09:51:34.163315 master-0 kubenswrapper[15202]: I0319 09:51:34.163195 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" event={"ID":"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9","Type":"ContainerStarted","Data":"4dbc8fc8f7cf66a5fc9653b9985e975f615f6ea391e68e7f2c9fa93730d06910"}
Mar 19 09:51:34.170579 master-0 kubenswrapper[15202]: I0319 09:51:34.170404 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"c8a687b2-1e3a-4510-af7c-34277b366455","Type":"ContainerStarted","Data":"e1a92a6a2b6da944618db0b55544fb619449ffd877913a119532214fdc1ed982"}
Mar 19 09:51:34.170579 master-0 kubenswrapper[15202]: I0319 09:51:34.170458 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"c8a687b2-1e3a-4510-af7c-34277b366455","Type":"ContainerStarted","Data":"c763f6d711cfa577afb89014244d3b0b25ced853df5413faf586b23fe5deba18"}
Mar 19 09:51:34.170579 master-0 kubenswrapper[15202]: I0319 09:51:34.170510 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" event={"ID":"c8a687b2-1e3a-4510-af7c-34277b366455","Type":"ContainerStarted","Data":"b9a2bedd56a96e9a20bdc0aa54c5836e1df8f3b26a144076ffb68fc2abfc457e"}
Mar 19 09:51:34.249234 master-0 kubenswrapper[15202]: I0319 09:51:34.249062 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" podStartSLOduration=3.249035325 podStartE2EDuration="3.249035325s" podCreationTimestamp="2026-03-19 09:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:34.243724143 +0000 UTC m=+1611.629138959" watchObservedRunningTime="2026-03-19 09:51:34.249035325 +0000 UTC m=+1611.634450141"
Mar 19 09:51:34.456454 master-0 kubenswrapper[15202]: I0319 09:51:34.453820 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-backup-0"]
Mar 19 09:51:34.603276 master-0 kubenswrapper[15202]: I0319 09:51:34.603141 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ba05-scheduler-0"]
Mar 19 09:51:34.849900 master-0 kubenswrapper[15202]: I0319 09:51:34.849815 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c324ae-8b95-430e-b544-90e2a4b5ff33" path="/var/lib/kubelet/pods/19c324ae-8b95-430e-b544-90e2a4b5ff33/volumes"
Mar 19 09:51:34.850690 master-0 kubenswrapper[15202]: I0319 09:51:34.850656 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9939df7e-1cba-4d74-95d7-524376a36627" path="/var/lib/kubelet/pods/9939df7e-1cba-4d74-95d7-524376a36627/volumes"
Mar 19
09:51:35.263204 master-0 kubenswrapper[15202]: I0319 09:51:35.262950 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"45f14454-38fd-4f69-81b6-8d66033b21d4","Type":"ContainerStarted","Data":"668cc77b9e22ed922967f540c38ca1af4039de80ed43784af8bb3e5b66ca3f07"} Mar 19 09:51:35.273663 master-0 kubenswrapper[15202]: I0319 09:51:35.272847 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"eca23393-b469-4d29-bb25-0b1edae5d066","Type":"ContainerStarted","Data":"26682f957764ad8222391377efa6deba2a75ab73dcc0f0601510a001b8c92bff"} Mar 19 09:51:35.273663 master-0 kubenswrapper[15202]: I0319 09:51:35.272905 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"eca23393-b469-4d29-bb25-0b1edae5d066","Type":"ContainerStarted","Data":"18eb216bfd3310ceb5339cf16bab4cc49722ede9da4aa9a64d3f71d9d8e7a7cc"} Mar 19 09:51:35.862541 master-0 kubenswrapper[15202]: I0319 09:51:35.861364 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cb6bf676c-xlvsw" podUID="76d834e7-4d24-4e34-8ebd-b71c80766e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.217:5353: i/o timeout" Mar 19 09:51:36.318556 master-0 kubenswrapper[15202]: I0319 09:51:36.318370 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-backup-0" event={"ID":"eca23393-b469-4d29-bb25-0b1edae5d066","Type":"ContainerStarted","Data":"74501d3049a4736361cbc420c18a6a0c830bd5d5b4f463412e058d7f11d381be"} Mar 19 09:51:36.319969 master-0 kubenswrapper[15202]: I0319 09:51:36.319918 15202 generic.go:334] "Generic (PLEG): container finished" podID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerID="b351eff4c20bb908b338fde6cdda00d87e674e8f9831fe42719e7a444cd88df8" exitCode=0 Mar 19 09:51:36.319969 master-0 kubenswrapper[15202]: I0319 09:51:36.319969 15202 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" event={"ID":"18d2318b-1a1b-45a3-9b06-ea750daf9e17","Type":"ContainerDied","Data":"b351eff4c20bb908b338fde6cdda00d87e674e8f9831fe42719e7a444cd88df8"} Mar 19 09:51:36.347662 master-0 kubenswrapper[15202]: I0319 09:51:36.347570 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"45f14454-38fd-4f69-81b6-8d66033b21d4","Type":"ContainerStarted","Data":"276f98c8c0a1a4c5b653e2ba57bf751b620056cb0147b5f4a2fd212b717dadd4"} Mar 19 09:51:37.509603 master-0 kubenswrapper[15202]: I0319 09:51:37.509424 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:38.329602 master-0 kubenswrapper[15202]: I0319 09:51:38.329453 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-backup-0" podStartSLOduration=5.329419823 podStartE2EDuration="5.329419823s" podCreationTimestamp="2026-03-19 09:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:38.278011436 +0000 UTC m=+1615.663426252" watchObservedRunningTime="2026-03-19 09:51:38.329419823 +0000 UTC m=+1615.714834639" Mar 19 09:51:38.378965 master-0 kubenswrapper[15202]: I0319 09:51:38.378876 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ba05-scheduler-0" event={"ID":"45f14454-38fd-4f69-81b6-8d66033b21d4","Type":"ContainerStarted","Data":"6588e5f87ff5ac7e3f097433fc9ea7a6322f970eac544ad7c8b23fd053499081"} Mar 19 09:51:38.730431 master-0 kubenswrapper[15202]: I0319 09:51:38.730313 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:38.902283 master-0 kubenswrapper[15202]: I0319 09:51:38.902167 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:39.117098 master-0 kubenswrapper[15202]: I0319 09:51:39.116945 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ba05-scheduler-0" podStartSLOduration=6.116923735 podStartE2EDuration="6.116923735s" podCreationTimestamp="2026-03-19 09:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:39.106243521 +0000 UTC m=+1616.491658337" watchObservedRunningTime="2026-03-19 09:51:39.116923735 +0000 UTC m=+1616.502338551" Mar 19 09:51:40.293133 master-0 kubenswrapper[15202]: I0319 09:51:40.293057 15202 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-7ba05-api-0" podUID="29cc00b0-1537-42ce-b8ce-918dea958cf9" containerName="cinder-api" probeResult="failure" output="Get \"https://10.128.0.235:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:40.410504 master-0 kubenswrapper[15202]: I0319 09:51:40.410427 15202 generic.go:334] "Generic (PLEG): container finished" podID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerID="6350fb831fcd1852ae42fd0cac2ad4b82126f840a2f7c9cf75aed041b345afe0" exitCode=0 Mar 19 09:51:40.410859 master-0 kubenswrapper[15202]: I0319 09:51:40.410545 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" event={"ID":"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9","Type":"ContainerDied","Data":"6350fb831fcd1852ae42fd0cac2ad4b82126f840a2f7c9cf75aed041b345afe0"} Mar 19 09:51:40.529660 master-0 kubenswrapper[15202]: I0319 09:51:40.527150 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-85f97d8d64-dfwgh" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.128.0.226:9696/\": dial tcp 10.128.0.226:9696: connect: connection 
refused" Mar 19 09:51:41.278333 master-0 kubenswrapper[15202]: I0319 09:51:41.278236 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-7ba05-api-0" podUID="29cc00b0-1537-42ce-b8ce-918dea958cf9" containerName="cinder-api" probeResult="failure" output="Get \"https://10.128.0.235:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:41.451516 master-0 kubenswrapper[15202]: I0319 09:51:41.447978 15202 generic.go:334] "Generic (PLEG): container finished" podID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerID="f6ac93c4bee5849db982df9b62b9dcdfee2899a896588d3b6aa0b6351cc21e33" exitCode=0 Mar 19 09:51:41.451516 master-0 kubenswrapper[15202]: I0319 09:51:41.448039 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" event={"ID":"18d2318b-1a1b-45a3-9b06-ea750daf9e17","Type":"ContainerDied","Data":"f6ac93c4bee5849db982df9b62b9dcdfee2899a896588d3b6aa0b6351cc21e33"} Mar 19 09:51:41.511575 master-0 kubenswrapper[15202]: I0319 09:51:41.510606 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:41.513781 master-0 kubenswrapper[15202]: I0319 09:51:41.513692 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:51:41.919192 master-0 kubenswrapper[15202]: I0319 09:51:41.914762 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67c9b9475d-ksb2w"] Mar 19 09:51:41.929292 master-0 kubenswrapper[15202]: I0319 09:51:41.927052 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.940150 master-0 kubenswrapper[15202]: I0319 09:51:41.939685 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:41.948631 master-0 kubenswrapper[15202]: I0319 09:51:41.943781 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67c9b9475d-ksb2w"] Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.978630 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nnml\" (UniqueName: \"kubernetes.io/projected/7328616d-ec33-44bc-a0ca-aad7c3ca650e-kube-api-access-6nnml\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.978838 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-config-data\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.978867 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7328616d-ec33-44bc-a0ca-aad7c3ca650e-logs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.978910 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-internal-tls-certs\") pod 
\"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.978965 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-scripts\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.978983 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-public-tls-certs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:41.980495 master-0 kubenswrapper[15202]: I0319 09:51:41.979001 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-combined-ca-bundle\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.085390 master-0 kubenswrapper[15202]: I0319 09:51:42.083082 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-internal-tls-certs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.088610 master-0 kubenswrapper[15202]: I0319 09:51:42.087173 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-scripts\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.088610 master-0 kubenswrapper[15202]: I0319 09:51:42.087240 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-public-tls-certs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.088610 master-0 kubenswrapper[15202]: I0319 09:51:42.087303 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-combined-ca-bundle\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.088610 master-0 kubenswrapper[15202]: I0319 09:51:42.087395 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nnml\" (UniqueName: \"kubernetes.io/projected/7328616d-ec33-44bc-a0ca-aad7c3ca650e-kube-api-access-6nnml\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.088610 master-0 kubenswrapper[15202]: I0319 09:51:42.088031 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-config-data\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.088610 master-0 kubenswrapper[15202]: I0319 09:51:42.088093 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7328616d-ec33-44bc-a0ca-aad7c3ca650e-logs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.098149 master-0 kubenswrapper[15202]: I0319 09:51:42.098090 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-internal-tls-certs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.104634 master-0 kubenswrapper[15202]: I0319 09:51:42.104035 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7328616d-ec33-44bc-a0ca-aad7c3ca650e-logs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.106382 master-0 kubenswrapper[15202]: I0319 09:51:42.106336 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-scripts\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.108206 master-0 kubenswrapper[15202]: I0319 09:51:42.108157 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-public-tls-certs\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.120958 master-0 kubenswrapper[15202]: I0319 09:51:42.116541 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-combined-ca-bundle\") 
pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.155032 master-0 kubenswrapper[15202]: I0319 09:51:42.117575 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7328616d-ec33-44bc-a0ca-aad7c3ca650e-config-data\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.155032 master-0 kubenswrapper[15202]: I0319 09:51:42.148054 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nnml\" (UniqueName: \"kubernetes.io/projected/7328616d-ec33-44bc-a0ca-aad7c3ca650e-kube-api-access-6nnml\") pod \"placement-67c9b9475d-ksb2w\" (UID: \"7328616d-ec33-44bc-a0ca-aad7c3ca650e\") " pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.292078 master-0 kubenswrapper[15202]: I0319 09:51:42.291988 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:42.490396 master-0 kubenswrapper[15202]: I0319 09:51:42.490321 15202 generic.go:334] "Generic (PLEG): container finished" podID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerID="ebf23344f287355631505c388ebb7641a50e86eb695575058db26a33e7064584" exitCode=0 Mar 19 09:51:42.490753 master-0 kubenswrapper[15202]: I0319 09:51:42.490594 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" event={"ID":"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9","Type":"ContainerDied","Data":"ebf23344f287355631505c388ebb7641a50e86eb695575058db26a33e7064584"} Mar 19 09:51:43.142981 master-0 kubenswrapper[15202]: I0319 09:51:43.142412 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67c9b9475d-ksb2w"] Mar 19 09:51:43.263219 master-0 kubenswrapper[15202]: I0319 09:51:43.256451 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7ba05-volume-lvm-iscsi-0" Mar 19 09:51:43.448843 master-0 kubenswrapper[15202]: I0319 09:51:43.448784 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:43.469717 master-0 kubenswrapper[15202]: I0319 09:51:43.469150 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/18d2318b-1a1b-45a3-9b06-ea750daf9e17-image-data\") pod \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " Mar 19 09:51:43.469854 master-0 kubenswrapper[15202]: I0319 09:51:43.469729 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v5bq\" (UniqueName: \"kubernetes.io/projected/18d2318b-1a1b-45a3-9b06-ea750daf9e17-kube-api-access-5v5bq\") pod \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\" (UID: \"18d2318b-1a1b-45a3-9b06-ea750daf9e17\") " Mar 19 09:51:43.500007 master-0 kubenswrapper[15202]: I0319 09:51:43.498779 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d2318b-1a1b-45a3-9b06-ea750daf9e17-kube-api-access-5v5bq" (OuterVolumeSpecName: "kube-api-access-5v5bq") pod "18d2318b-1a1b-45a3-9b06-ea750daf9e17" (UID: "18d2318b-1a1b-45a3-9b06-ea750daf9e17"). InnerVolumeSpecName "kube-api-access-5v5bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:43.522306 master-0 kubenswrapper[15202]: I0319 09:51:43.522241 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c9b9475d-ksb2w" event={"ID":"7328616d-ec33-44bc-a0ca-aad7c3ca650e","Type":"ContainerStarted","Data":"fe93c7313218f275c64939b62e3fb630ebcbc1a906ff53d6ee90884565d45eae"} Mar 19 09:51:43.524378 master-0 kubenswrapper[15202]: I0319 09:51:43.524355 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" Mar 19 09:51:43.525287 master-0 kubenswrapper[15202]: I0319 09:51:43.525249 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-lfsjb" event={"ID":"18d2318b-1a1b-45a3-9b06-ea750daf9e17","Type":"ContainerDied","Data":"f478d50c419fe4e245be20df9ad42de75814761db2769d84704dfa5a770731ba"} Mar 19 09:51:43.525287 master-0 kubenswrapper[15202]: I0319 09:51:43.525285 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f478d50c419fe4e245be20df9ad42de75814761db2769d84704dfa5a770731ba" Mar 19 09:51:43.575966 master-0 kubenswrapper[15202]: I0319 09:51:43.574571 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v5bq\" (UniqueName: \"kubernetes.io/projected/18d2318b-1a1b-45a3-9b06-ea750daf9e17-kube-api-access-5v5bq\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:43.680070 master-0 kubenswrapper[15202]: I0319 09:51:43.679894 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d2318b-1a1b-45a3-9b06-ea750daf9e17-image-data" (OuterVolumeSpecName: "image-data") pod "18d2318b-1a1b-45a3-9b06-ea750daf9e17" (UID: "18d2318b-1a1b-45a3-9b06-ea750daf9e17"). InnerVolumeSpecName "image-data". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:43.781115 master-0 kubenswrapper[15202]: I0319 09:51:43.781036 15202 reconciler_common.go:293] "Volume detached for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/18d2318b-1a1b-45a3-9b06-ea750daf9e17-image-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:44.035763 master-0 kubenswrapper[15202]: I0319 09:51:44.035698 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" Mar 19 09:51:44.104533 master-0 kubenswrapper[15202]: I0319 09:51:44.103969 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tptlr\" (UniqueName: \"kubernetes.io/projected/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-kube-api-access-tptlr\") pod \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " Mar 19 09:51:44.104533 master-0 kubenswrapper[15202]: I0319 09:51:44.104203 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-image-data\") pod \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\" (UID: \"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9\") " Mar 19 09:51:44.166654 master-0 kubenswrapper[15202]: I0319 09:51:44.159980 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-kube-api-access-tptlr" (OuterVolumeSpecName: "kube-api-access-tptlr") pod "8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" (UID: "8d49e40e-907d-47f6-b07e-1cf72ec3f3a9"). InnerVolumeSpecName "kube-api-access-tptlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:44.197206 master-0 kubenswrapper[15202]: I0319 09:51:44.192395 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7ba05-backup-0" Mar 19 09:51:44.283200 master-0 kubenswrapper[15202]: I0319 09:51:44.272915 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tptlr\" (UniqueName: \"kubernetes.io/projected/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-kube-api-access-tptlr\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:44.365293 master-0 kubenswrapper[15202]: I0319 09:51:44.365224 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7ba05-scheduler-0" Mar 19 09:51:44.436919 master-0 kubenswrapper[15202]: I0319 09:51:44.435753 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-image-data" (OuterVolumeSpecName: "image-data") pod "8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" (UID: "8d49e40e-907d-47f6-b07e-1cf72ec3f3a9"). InnerVolumeSpecName "image-data". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:44.479181 master-0 kubenswrapper[15202]: I0319 09:51:44.479107 15202 reconciler_common.go:293] "Volume detached for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9-image-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:44.555942 master-0 kubenswrapper[15202]: I0319 09:51:44.555857 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c9b9475d-ksb2w" event={"ID":"7328616d-ec33-44bc-a0ca-aad7c3ca650e","Type":"ContainerStarted","Data":"f2ab03537a968df3fc3b9ef299f9aecfacfc6f5482393b23e6cb895c20e69237"} Mar 19 09:51:44.556510 master-0 kubenswrapper[15202]: I0319 09:51:44.555961 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67c9b9475d-ksb2w" event={"ID":"7328616d-ec33-44bc-a0ca-aad7c3ca650e","Type":"ContainerStarted","Data":"60d293e73422247b43f39615d29e3e60ef71197459918def6a67a87b28cc054d"} Mar 19 09:51:44.556510 master-0 kubenswrapper[15202]: I0319 09:51:44.555988 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:44.557392 master-0 kubenswrapper[15202]: I0319 09:51:44.557282 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67c9b9475d-ksb2w" Mar 19 09:51:44.561763 master-0 kubenswrapper[15202]: I0319 09:51:44.561714 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z" event={"ID":"8d49e40e-907d-47f6-b07e-1cf72ec3f3a9","Type":"ContainerDied","Data":"4dbc8fc8f7cf66a5fc9653b9985e975f615f6ea391e68e7f2c9fa93730d06910"} Mar 19 09:51:44.563747 master-0 kubenswrapper[15202]: I0319 09:51:44.563726 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbc8fc8f7cf66a5fc9653b9985e975f615f6ea391e68e7f2c9fa93730d06910" Mar 19 09:51:44.563860 master-0 kubenswrapper[15202]: 
I0319 09:51:44.562315 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-x7j8z"
Mar 19 09:51:44.675984 master-0 kubenswrapper[15202]: I0319 09:51:44.675842 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67c9b9475d-ksb2w" podStartSLOduration=3.675799907 podStartE2EDuration="3.675799907s" podCreationTimestamp="2026-03-19 09:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:44.588460945 +0000 UTC m=+1621.973875771" watchObservedRunningTime="2026-03-19 09:51:44.675799907 +0000 UTC m=+1622.061214723"
Mar 19 09:51:44.748657 master-0 kubenswrapper[15202]: I0319 09:51:44.746556 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b44d66bc9-5zxbb"
Mar 19 09:51:45.092458 master-0 kubenswrapper[15202]: I0319 09:51:45.092350 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-7ba05-api-0"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.062170 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: E0319 09:51:48.062974 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerName="edpm-a-provisionserver-checksum-discovery"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.062995 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerName="edpm-a-provisionserver-checksum-discovery"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: E0319 09:51:48.063027 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerName="init"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.063037 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerName="init"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: E0319 09:51:48.063078 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerName="edpm-b-provisionserver-checksum-discovery"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.063088 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerName="edpm-b-provisionserver-checksum-discovery"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: E0319 09:51:48.063105 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerName="init"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.063113 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerName="init"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.063443 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" containerName="edpm-b-provisionserver-checksum-discovery"
Mar 19 09:51:48.064608 master-0 kubenswrapper[15202]: I0319 09:51:48.063504 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" containerName="edpm-a-provisionserver-checksum-discovery"
Mar 19 09:51:48.065377 master-0 kubenswrapper[15202]: I0319 09:51:48.064701 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 19 09:51:48.069015 master-0 kubenswrapper[15202]: I0319 09:51:48.068546 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 19 09:51:48.069015 master-0 kubenswrapper[15202]: I0319 09:51:48.068586 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 19 09:51:48.100089 master-0 kubenswrapper[15202]: I0319 09:51:48.077086 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:48.256510 master-0 kubenswrapper[15202]: I0319 09:51:48.255255 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh9ft\" (UniqueName: \"kubernetes.io/projected/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-kube-api-access-gh9ft\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.256510 master-0 kubenswrapper[15202]: I0319 09:51:48.255349 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.256510 master-0 kubenswrapper[15202]: I0319 09:51:48.255642 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.256510 master-0 kubenswrapper[15202]: I0319 09:51:48.255722 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.274679 master-0 kubenswrapper[15202]: I0319 09:51:48.274624 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:48.275961 master-0 kubenswrapper[15202]: E0319 09:51:48.275934 15202 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-gh9ft openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="7c0cf75d-1106-4b10-9d2d-c0238d30cd70"
Mar 19 09:51:48.302338 master-0 kubenswrapper[15202]: I0319 09:51:48.302275 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:48.323620 master-0 kubenswrapper[15202]: I0319 09:51:48.323570 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:48.325439 master-0 kubenswrapper[15202]: I0319 09:51:48.325416 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 19 09:51:48.340324 master-0 kubenswrapper[15202]: I0319 09:51:48.340259 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:48.357422 master-0 kubenswrapper[15202]: I0319 09:51:48.357283 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.357422 master-0 kubenswrapper[15202]: I0319 09:51:48.357344 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.357702 master-0 kubenswrapper[15202]: I0319 09:51:48.357484 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh9ft\" (UniqueName: \"kubernetes.io/projected/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-kube-api-access-gh9ft\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.357702 master-0 kubenswrapper[15202]: I0319 09:51:48.357536 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.360116 master-0 kubenswrapper[15202]: I0319 09:51:48.359653 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.365320 master-0 kubenswrapper[15202]: E0319 09:51:48.365094 15202 projected.go:194] Error preparing data for projected volume kube-api-access-gh9ft for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (7c0cf75d-1106-4b10-9d2d-c0238d30cd70) does not match the UID in record. The object might have been deleted and then recreated
Mar 19 09:51:48.365320 master-0 kubenswrapper[15202]: E0319 09:51:48.365241 15202 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-kube-api-access-gh9ft podName:7c0cf75d-1106-4b10-9d2d-c0238d30cd70 nodeName:}" failed. No retries permitted until 2026-03-19 09:51:48.865205808 +0000 UTC m=+1626.250620624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gh9ft" (UniqueName: "kubernetes.io/projected/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-kube-api-access-gh9ft") pod "openstackclient" (UID: "7c0cf75d-1106-4b10-9d2d-c0238d30cd70") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (7c0cf75d-1106-4b10-9d2d-c0238d30cd70) does not match the UID in record. The object might have been deleted and then recreated
Mar 19 09:51:48.368597 master-0 kubenswrapper[15202]: I0319 09:51:48.368554 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.372081 master-0 kubenswrapper[15202]: I0319 09:51:48.372041 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config-secret\") pod \"openstackclient\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") " pod="openstack/openstackclient"
Mar 19 09:51:48.459988 master-0 kubenswrapper[15202]: I0319 09:51:48.459947 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7eb8479a-645c-40f7-852f-1b0fb72fa067-openstack-config\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.460426 master-0 kubenswrapper[15202]: I0319 09:51:48.460409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb8479a-645c-40f7-852f-1b0fb72fa067-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.460672 master-0 kubenswrapper[15202]: I0319 09:51:48.460640 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27bgp\" (UniqueName: \"kubernetes.io/projected/7eb8479a-645c-40f7-852f-1b0fb72fa067-kube-api-access-27bgp\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.460801 master-0 kubenswrapper[15202]: I0319 09:51:48.460785 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7eb8479a-645c-40f7-852f-1b0fb72fa067-openstack-config-secret\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.467969 master-0 kubenswrapper[15202]: I0319 09:51:48.467919 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85f97d8d64-dfwgh_bde4d125-5422-48a5-809b-b7326315062c/neutron-api/0.log"
Mar 19 09:51:48.468203 master-0 kubenswrapper[15202]: I0319 09:51:48.468022 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:48.565374 master-0 kubenswrapper[15202]: I0319 09:51:48.565302 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-httpd-config\") pod \"bde4d125-5422-48a5-809b-b7326315062c\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") "
Mar 19 09:51:48.565661 master-0 kubenswrapper[15202]: I0319 09:51:48.565446 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-combined-ca-bundle\") pod \"bde4d125-5422-48a5-809b-b7326315062c\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") "
Mar 19 09:51:48.565661 master-0 kubenswrapper[15202]: I0319 09:51:48.565557 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-config\") pod \"bde4d125-5422-48a5-809b-b7326315062c\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") "
Mar 19 09:51:48.565661 master-0 kubenswrapper[15202]: I0319 09:51:48.565602 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsmv9\" (UniqueName: \"kubernetes.io/projected/bde4d125-5422-48a5-809b-b7326315062c-kube-api-access-gsmv9\") pod \"bde4d125-5422-48a5-809b-b7326315062c\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") "
Mar 19 09:51:48.565661 master-0 kubenswrapper[15202]: I0319 09:51:48.565660 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-ovndb-tls-certs\") pod \"bde4d125-5422-48a5-809b-b7326315062c\" (UID: \"bde4d125-5422-48a5-809b-b7326315062c\") "
Mar 19 09:51:48.567683 master-0 kubenswrapper[15202]: I0319 09:51:48.567526 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb8479a-645c-40f7-852f-1b0fb72fa067-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.567965 master-0 kubenswrapper[15202]: I0319 09:51:48.567911 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27bgp\" (UniqueName: \"kubernetes.io/projected/7eb8479a-645c-40f7-852f-1b0fb72fa067-kube-api-access-27bgp\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.568134 master-0 kubenswrapper[15202]: I0319 09:51:48.568103 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7eb8479a-645c-40f7-852f-1b0fb72fa067-openstack-config-secret\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.569600 master-0 kubenswrapper[15202]: I0319 09:51:48.568836 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7eb8479a-645c-40f7-852f-1b0fb72fa067-openstack-config\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.570273 master-0 kubenswrapper[15202]: I0319 09:51:48.570233 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7eb8479a-645c-40f7-852f-1b0fb72fa067-openstack-config\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.571626 master-0 kubenswrapper[15202]: I0319 09:51:48.571581 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bde4d125-5422-48a5-809b-b7326315062c" (UID: "bde4d125-5422-48a5-809b-b7326315062c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:48.578192 master-0 kubenswrapper[15202]: I0319 09:51:48.572926 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb8479a-645c-40f7-852f-1b0fb72fa067-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.578192 master-0 kubenswrapper[15202]: I0319 09:51:48.577737 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde4d125-5422-48a5-809b-b7326315062c-kube-api-access-gsmv9" (OuterVolumeSpecName: "kube-api-access-gsmv9") pod "bde4d125-5422-48a5-809b-b7326315062c" (UID: "bde4d125-5422-48a5-809b-b7326315062c"). InnerVolumeSpecName "kube-api-access-gsmv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:51:48.588104 master-0 kubenswrapper[15202]: I0319 09:51:48.584343 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7eb8479a-645c-40f7-852f-1b0fb72fa067-openstack-config-secret\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.605670 master-0 kubenswrapper[15202]: I0319 09:51:48.600740 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27bgp\" (UniqueName: \"kubernetes.io/projected/7eb8479a-645c-40f7-852f-1b0fb72fa067-kube-api-access-27bgp\") pod \"openstackclient\" (UID: \"7eb8479a-645c-40f7-852f-1b0fb72fa067\") " pod="openstack/openstackclient"
Mar 19 09:51:48.635520 master-0 kubenswrapper[15202]: I0319 09:51:48.635361 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85f97d8d64-dfwgh_bde4d125-5422-48a5-809b-b7326315062c/neutron-api/0.log"
Mar 19 09:51:48.635520 master-0 kubenswrapper[15202]: I0319 09:51:48.635430 15202 generic.go:334] "Generic (PLEG): container finished" podID="bde4d125-5422-48a5-809b-b7326315062c" containerID="5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859" exitCode=137
Mar 19 09:51:48.635783 master-0 kubenswrapper[15202]: I0319 09:51:48.635527 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 19 09:51:48.636590 master-0 kubenswrapper[15202]: I0319 09:51:48.636544 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f97d8d64-dfwgh"
Mar 19 09:51:48.638851 master-0 kubenswrapper[15202]: I0319 09:51:48.637445 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f97d8d64-dfwgh" event={"ID":"bde4d125-5422-48a5-809b-b7326315062c","Type":"ContainerDied","Data":"5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859"}
Mar 19 09:51:48.638851 master-0 kubenswrapper[15202]: I0319 09:51:48.637821 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f97d8d64-dfwgh" event={"ID":"bde4d125-5422-48a5-809b-b7326315062c","Type":"ContainerDied","Data":"398e0e9ae83ff91111cde00109ab657c02c0a9a9f9134130abd316b594b5e31b"}
Mar 19 09:51:48.638851 master-0 kubenswrapper[15202]: I0319 09:51:48.637851 15202 scope.go:117] "RemoveContainer" containerID="4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e"
Mar 19 09:51:48.653671 master-0 kubenswrapper[15202]: I0319 09:51:48.653026 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 19 09:51:48.653891 master-0 kubenswrapper[15202]: I0319 09:51:48.653792 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c0cf75d-1106-4b10-9d2d-c0238d30cd70" podUID="7eb8479a-645c-40f7-852f-1b0fb72fa067"
Mar 19 09:51:48.661411 master-0 kubenswrapper[15202]: I0319 09:51:48.661343 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bde4d125-5422-48a5-809b-b7326315062c" (UID: "bde4d125-5422-48a5-809b-b7326315062c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:48.666485 master-0 kubenswrapper[15202]: I0319 09:51:48.665393 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-config" (OuterVolumeSpecName: "config") pod "bde4d125-5422-48a5-809b-b7326315062c" (UID: "bde4d125-5422-48a5-809b-b7326315062c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:48.675661 master-0 kubenswrapper[15202]: I0319 09:51:48.672163 15202 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-httpd-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.675661 master-0 kubenswrapper[15202]: I0319 09:51:48.672221 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.675661 master-0 kubenswrapper[15202]: I0319 09:51:48.672237 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.675661 master-0 kubenswrapper[15202]: I0319 09:51:48.672252 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsmv9\" (UniqueName: \"kubernetes.io/projected/bde4d125-5422-48a5-809b-b7326315062c-kube-api-access-gsmv9\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.731784 master-0 kubenswrapper[15202]: I0319 09:51:48.731716 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bde4d125-5422-48a5-809b-b7326315062c" (UID: "bde4d125-5422-48a5-809b-b7326315062c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:48.740440 master-0 kubenswrapper[15202]: I0319 09:51:48.740384 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 19 09:51:48.750709 master-0 kubenswrapper[15202]: I0319 09:51:48.750091 15202 scope.go:117] "RemoveContainer" containerID="5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859"
Mar 19 09:51:48.775464 master-0 kubenswrapper[15202]: I0319 09:51:48.775400 15202 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde4d125-5422-48a5-809b-b7326315062c-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.810610 master-0 kubenswrapper[15202]: I0319 09:51:48.809834 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77db675565-g4zz2"
Mar 19 09:51:48.821406 master-0 kubenswrapper[15202]: I0319 09:51:48.820551 15202 scope.go:117] "RemoveContainer" containerID="4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e"
Mar 19 09:51:48.822935 master-0 kubenswrapper[15202]: E0319 09:51:48.822896 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e\": container with ID starting with 4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e not found: ID does not exist" containerID="4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e"
Mar 19 09:51:48.823003 master-0 kubenswrapper[15202]: I0319 09:51:48.822935 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e"} err="failed to get container status \"4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e\": rpc error: code = NotFound desc = could not find container \"4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e\": container with ID starting with 4fd7920c069ce6ba8241fd3b61e9f2d2846559b71bb5759f6f8354d1826e971e not found: ID does not exist"
Mar 19 09:51:48.823003 master-0 kubenswrapper[15202]: I0319 09:51:48.822958 15202 scope.go:117] "RemoveContainer" containerID="5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859"
Mar 19 09:51:48.824506 master-0 kubenswrapper[15202]: E0319 09:51:48.824429 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859\": container with ID starting with 5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859 not found: ID does not exist" containerID="5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859"
Mar 19 09:51:48.824506 master-0 kubenswrapper[15202]: I0319 09:51:48.824461 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859"} err="failed to get container status \"5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859\": rpc error: code = NotFound desc = could not find container \"5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859\": container with ID starting with 5bfdcd3673917a34a2b5603e69bcf52ff6892db09398c819d14863349c2b6859 not found: ID does not exist"
Mar 19 09:51:48.889314 master-0 kubenswrapper[15202]: I0319 09:51:48.889168 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-combined-ca-bundle\") pod \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") "
Mar 19 09:51:48.889535 master-0 kubenswrapper[15202]: I0319 09:51:48.889330 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config\") pod \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") "
Mar 19 09:51:48.889535 master-0 kubenswrapper[15202]: I0319 09:51:48.889414 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config-secret\") pod \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\" (UID: \"7c0cf75d-1106-4b10-9d2d-c0238d30cd70\") "
Mar 19 09:51:48.891539 master-0 kubenswrapper[15202]: I0319 09:51:48.889990 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh9ft\" (UniqueName: \"kubernetes.io/projected/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-kube-api-access-gh9ft\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.894361 master-0 kubenswrapper[15202]: I0319 09:51:48.891942 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7c0cf75d-1106-4b10-9d2d-c0238d30cd70" (UID: "7c0cf75d-1106-4b10-9d2d-c0238d30cd70"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:51:48.896341 master-0 kubenswrapper[15202]: I0319 09:51:48.896292 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7c0cf75d-1106-4b10-9d2d-c0238d30cd70" (UID: "7c0cf75d-1106-4b10-9d2d-c0238d30cd70"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:48.913740 master-0 kubenswrapper[15202]: I0319 09:51:48.910301 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c0cf75d-1106-4b10-9d2d-c0238d30cd70" (UID: "7c0cf75d-1106-4b10-9d2d-c0238d30cd70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:48.998695 master-0 kubenswrapper[15202]: I0319 09:51:48.998264 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.998695 master-0 kubenswrapper[15202]: I0319 09:51:48.998329 15202 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:48.998695 master-0 kubenswrapper[15202]: I0319 09:51:48.998344 15202 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7c0cf75d-1106-4b10-9d2d-c0238d30cd70-openstack-config-secret\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:49.035138 master-0 kubenswrapper[15202]: I0319 09:51:49.034193 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cd95f9d78-s2fkv"]
Mar 19 09:51:49.037006 master-0 kubenswrapper[15202]: I0319 09:51:49.035865 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cd95f9d78-s2fkv" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-httpd" containerID="cri-o://006abe7cbc3153c692320c85322b89c89f63c8ad1b8605e8c10f9ff7418e02cf" gracePeriod=30
Mar 19 09:51:49.037006 master-0 kubenswrapper[15202]: I0319 09:51:49.036130 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cd95f9d78-s2fkv" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-api" containerID="cri-o://7ff529613299b924c3d4cb1d4031e6538c493fe9beb8cb333c50be6f14dacc6a" gracePeriod=30
Mar 19 09:51:49.098652 master-0 kubenswrapper[15202]: I0319 09:51:49.097647 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85f97d8d64-dfwgh"]
Mar 19 09:51:49.132712 master-0 kubenswrapper[15202]: I0319 09:51:49.132636 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-85f97d8d64-dfwgh"]
Mar 19 09:51:49.245937 master-0 kubenswrapper[15202]: I0319 09:51:49.245892 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 19 09:51:49.675600 master-0 kubenswrapper[15202]: I0319 09:51:49.675300 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7eb8479a-645c-40f7-852f-1b0fb72fa067","Type":"ContainerStarted","Data":"bcbcdcb2f99f420612abbb4ade014bfc81780a64fbcf7eb8b313af64fd262787"}
Mar 19 09:51:49.680485 master-0 kubenswrapper[15202]: I0319 09:51:49.677662 15202 generic.go:334] "Generic (PLEG): container finished" podID="9a13a111-1257-4963-8c30-51d28728449e" containerID="006abe7cbc3153c692320c85322b89c89f63c8ad1b8605e8c10f9ff7418e02cf" exitCode=0
Mar 19 09:51:49.680485 master-0 kubenswrapper[15202]: I0319 09:51:49.677744 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cd95f9d78-s2fkv" event={"ID":"9a13a111-1257-4963-8c30-51d28728449e","Type":"ContainerDied","Data":"006abe7cbc3153c692320c85322b89c89f63c8ad1b8605e8c10f9ff7418e02cf"}
Mar 19 09:51:49.680485 master-0 kubenswrapper[15202]: I0319 09:51:49.679190 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 19 09:51:49.701488 master-0 kubenswrapper[15202]: I0319 09:51:49.699019 15202 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7c0cf75d-1106-4b10-9d2d-c0238d30cd70" podUID="7eb8479a-645c-40f7-852f-1b0fb72fa067"
Mar 19 09:51:50.839077 master-0 kubenswrapper[15202]: I0319 09:51:50.836188 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0cf75d-1106-4b10-9d2d-c0238d30cd70" path="/var/lib/kubelet/pods/7c0cf75d-1106-4b10-9d2d-c0238d30cd70/volumes"
Mar 19 09:51:50.839077 master-0 kubenswrapper[15202]: I0319 09:51:50.836804 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde4d125-5422-48a5-809b-b7326315062c" path="/var/lib/kubelet/pods/bde4d125-5422-48a5-809b-b7326315062c/volumes"
Mar 19 09:51:51.713130 master-0 kubenswrapper[15202]: I0319 09:51:51.713064 15202 generic.go:334] "Generic (PLEG): container finished" podID="9a13a111-1257-4963-8c30-51d28728449e" containerID="7ff529613299b924c3d4cb1d4031e6538c493fe9beb8cb333c50be6f14dacc6a" exitCode=0
Mar 19 09:51:51.713130 master-0 kubenswrapper[15202]: I0319 09:51:51.713124 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cd95f9d78-s2fkv" event={"ID":"9a13a111-1257-4963-8c30-51d28728449e","Type":"ContainerDied","Data":"7ff529613299b924c3d4cb1d4031e6538c493fe9beb8cb333c50be6f14dacc6a"}
Mar 19 09:51:51.713397 master-0 kubenswrapper[15202]: I0319 09:51:51.713156 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cd95f9d78-s2fkv" event={"ID":"9a13a111-1257-4963-8c30-51d28728449e","Type":"ContainerDied","Data":"79e3f9cfbfd95ceea9237898839f5bb3b9fef4b984a43e09f631cc348938fbf1"}
Mar 19 09:51:51.713397 master-0 kubenswrapper[15202]: I0319 09:51:51.713169 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79e3f9cfbfd95ceea9237898839f5bb3b9fef4b984a43e09f631cc348938fbf1"
Mar 19 09:51:51.726412 master-0 kubenswrapper[15202]: I0319 09:51:51.726356 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cd95f9d78-s2fkv"
Mar 19 09:51:51.782530 master-0 kubenswrapper[15202]: I0319 09:51:51.782429 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-combined-ca-bundle\") pod \"9a13a111-1257-4963-8c30-51d28728449e\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") "
Mar 19 09:51:51.782530 master-0 kubenswrapper[15202]: I0319 09:51:51.782538 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-httpd-config\") pod \"9a13a111-1257-4963-8c30-51d28728449e\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") "
Mar 19 09:51:51.782869 master-0 kubenswrapper[15202]: I0319 09:51:51.782644 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-config\") pod \"9a13a111-1257-4963-8c30-51d28728449e\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") "
Mar 19 09:51:51.782933 master-0 kubenswrapper[15202]: I0319 09:51:51.782911 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2hnn\" (UniqueName: \"kubernetes.io/projected/9a13a111-1257-4963-8c30-51d28728449e-kube-api-access-w2hnn\") pod \"9a13a111-1257-4963-8c30-51d28728449e\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") "
Mar 19 09:51:51.782983 master-0 kubenswrapper[15202]: I0319 09:51:51.782946 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-ovndb-tls-certs\") pod \"9a13a111-1257-4963-8c30-51d28728449e\" (UID: \"9a13a111-1257-4963-8c30-51d28728449e\") "
Mar 19 09:51:51.788642 master-0 kubenswrapper[15202]: I0319 09:51:51.788572 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9a13a111-1257-4963-8c30-51d28728449e" (UID: "9a13a111-1257-4963-8c30-51d28728449e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:51.791852 master-0 kubenswrapper[15202]: I0319 09:51:51.791830 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a13a111-1257-4963-8c30-51d28728449e-kube-api-access-w2hnn" (OuterVolumeSpecName: "kube-api-access-w2hnn") pod "9a13a111-1257-4963-8c30-51d28728449e" (UID: "9a13a111-1257-4963-8c30-51d28728449e"). InnerVolumeSpecName "kube-api-access-w2hnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:51:51.864356 master-0 kubenswrapper[15202]: I0319 09:51:51.863925 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a13a111-1257-4963-8c30-51d28728449e" (UID: "9a13a111-1257-4963-8c30-51d28728449e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:51.867864 master-0 kubenswrapper[15202]: I0319 09:51:51.867808 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-config" (OuterVolumeSpecName: "config") pod "9a13a111-1257-4963-8c30-51d28728449e" (UID: "9a13a111-1257-4963-8c30-51d28728449e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:51:51.889314 master-0 kubenswrapper[15202]: I0319 09:51:51.889257 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2hnn\" (UniqueName: \"kubernetes.io/projected/9a13a111-1257-4963-8c30-51d28728449e-kube-api-access-w2hnn\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:51.891459 master-0 kubenswrapper[15202]: I0319 09:51:51.891402 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:51.891459 master-0 kubenswrapper[15202]: I0319 09:51:51.891440 15202 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-httpd-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:51.891612 master-0 kubenswrapper[15202]: I0319 09:51:51.891457 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:51:51.913344 master-0 kubenswrapper[15202]: I0319 09:51:51.913258 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9a13a111-1257-4963-8c30-51d28728449e" (UID: "9a13a111-1257-4963-8c30-51d28728449e"). InnerVolumeSpecName "ovndb-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:51.993426 master-0 kubenswrapper[15202]: I0319 09:51:51.993368 15202 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a13a111-1257-4963-8c30-51d28728449e-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:51:52.675946 master-0 kubenswrapper[15202]: I0319 09:51:52.675830 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-77dc968fc8-nnkkj"] Mar 19 09:51:52.676337 master-0 kubenswrapper[15202]: E0319 09:51:52.676315 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-api" Mar 19 09:51:52.676337 master-0 kubenswrapper[15202]: I0319 09:51:52.676333 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-api" Mar 19 09:51:52.676433 master-0 kubenswrapper[15202]: E0319 09:51:52.676352 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-httpd" Mar 19 09:51:52.676433 master-0 kubenswrapper[15202]: I0319 09:51:52.676359 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-httpd" Mar 19 09:51:52.676433 master-0 kubenswrapper[15202]: E0319 09:51:52.676403 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-api" Mar 19 09:51:52.676433 master-0 kubenswrapper[15202]: I0319 09:51:52.676410 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-api" Mar 19 09:51:52.676433 master-0 kubenswrapper[15202]: E0319 09:51:52.676424 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-httpd" Mar 19 
09:51:52.676433 master-0 kubenswrapper[15202]: I0319 09:51:52.676430 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-httpd" Mar 19 09:51:52.677018 master-0 kubenswrapper[15202]: I0319 09:51:52.676647 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-httpd" Mar 19 09:51:52.677018 master-0 kubenswrapper[15202]: I0319 09:51:52.676665 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde4d125-5422-48a5-809b-b7326315062c" containerName="neutron-api" Mar 19 09:51:52.677018 master-0 kubenswrapper[15202]: I0319 09:51:52.676695 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-httpd" Mar 19 09:51:52.677018 master-0 kubenswrapper[15202]: I0319 09:51:52.676704 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a13a111-1257-4963-8c30-51d28728449e" containerName="neutron-api" Mar 19 09:51:52.678271 master-0 kubenswrapper[15202]: I0319 09:51:52.678247 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.697051 master-0 kubenswrapper[15202]: I0319 09:51:52.696989 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 09:51:52.697379 master-0 kubenswrapper[15202]: I0319 09:51:52.697337 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 09:51:52.697438 master-0 kubenswrapper[15202]: I0319 09:51:52.697394 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 09:51:52.714605 master-0 kubenswrapper[15202]: I0319 09:51:52.713671 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77dc968fc8-nnkkj"] Mar 19 09:51:52.728059 master-0 kubenswrapper[15202]: I0319 09:51:52.727960 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cd95f9d78-s2fkv" Mar 19 09:51:52.818137 master-0 kubenswrapper[15202]: I0319 09:51:52.817322 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-public-tls-certs\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.819414 master-0 kubenswrapper[15202]: I0319 09:51:52.819376 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/180cd549-4f02-4a40-875d-5d44423f0b2f-log-httpd\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.820346 master-0 kubenswrapper[15202]: I0319 09:51:52.820318 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-combined-ca-bundle\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.820505 master-0 kubenswrapper[15202]: I0319 09:51:52.820482 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-internal-tls-certs\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.820893 master-0 kubenswrapper[15202]: I0319 09:51:52.820873 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/180cd549-4f02-4a40-875d-5d44423f0b2f-etc-swift\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.821086 master-0 kubenswrapper[15202]: I0319 09:51:52.821065 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-config-data\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.821605 master-0 kubenswrapper[15202]: I0319 09:51:52.821581 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5hv\" (UniqueName: \"kubernetes.io/projected/180cd549-4f02-4a40-875d-5d44423f0b2f-kube-api-access-vn5hv\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.821818 master-0 kubenswrapper[15202]: 
I0319 09:51:52.821793 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/180cd549-4f02-4a40-875d-5d44423f0b2f-run-httpd\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.835764 master-0 kubenswrapper[15202]: I0319 09:51:52.835706 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cd95f9d78-s2fkv"] Mar 19 09:51:52.842936 master-0 kubenswrapper[15202]: I0319 09:51:52.842873 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7cd95f9d78-s2fkv"] Mar 19 09:51:52.926995 master-0 kubenswrapper[15202]: I0319 09:51:52.926571 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/180cd549-4f02-4a40-875d-5d44423f0b2f-run-httpd\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.926995 master-0 kubenswrapper[15202]: I0319 09:51:52.926895 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-public-tls-certs\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927755 master-0 kubenswrapper[15202]: I0319 09:51:52.927104 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/180cd549-4f02-4a40-875d-5d44423f0b2f-run-httpd\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927755 master-0 kubenswrapper[15202]: I0319 09:51:52.927304 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/180cd549-4f02-4a40-875d-5d44423f0b2f-log-httpd\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927755 master-0 kubenswrapper[15202]: I0319 09:51:52.927369 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-combined-ca-bundle\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927755 master-0 kubenswrapper[15202]: I0319 09:51:52.927396 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-internal-tls-certs\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927947 master-0 kubenswrapper[15202]: I0319 09:51:52.927752 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/180cd549-4f02-4a40-875d-5d44423f0b2f-etc-swift\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927947 master-0 kubenswrapper[15202]: I0319 09:51:52.927800 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/180cd549-4f02-4a40-875d-5d44423f0b2f-log-httpd\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.927947 master-0 kubenswrapper[15202]: I0319 09:51:52.927934 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-config-data\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.928077 master-0 kubenswrapper[15202]: I0319 09:51:52.928050 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5hv\" (UniqueName: \"kubernetes.io/projected/180cd549-4f02-4a40-875d-5d44423f0b2f-kube-api-access-vn5hv\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.935653 master-0 kubenswrapper[15202]: I0319 09:51:52.935606 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-config-data\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.935797 master-0 kubenswrapper[15202]: I0319 09:51:52.935665 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-combined-ca-bundle\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.944890 master-0 kubenswrapper[15202]: I0319 09:51:52.944824 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-internal-tls-certs\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.949035 master-0 kubenswrapper[15202]: I0319 09:51:52.948971 15202 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/180cd549-4f02-4a40-875d-5d44423f0b2f-etc-swift\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.949997 master-0 kubenswrapper[15202]: I0319 09:51:52.949961 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/180cd549-4f02-4a40-875d-5d44423f0b2f-public-tls-certs\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:52.981214 master-0 kubenswrapper[15202]: I0319 09:51:52.981074 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5hv\" (UniqueName: \"kubernetes.io/projected/180cd549-4f02-4a40-875d-5d44423f0b2f-kube-api-access-vn5hv\") pod \"swift-proxy-77dc968fc8-nnkkj\" (UID: \"180cd549-4f02-4a40-875d-5d44423f0b2f\") " pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:53.016627 master-0 kubenswrapper[15202]: I0319 09:51:53.016291 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:53.548783 master-0 kubenswrapper[15202]: I0319 09:51:53.548721 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77dc968fc8-nnkkj"] Mar 19 09:51:53.748392 master-0 kubenswrapper[15202]: I0319 09:51:53.748329 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77dc968fc8-nnkkj" event={"ID":"180cd549-4f02-4a40-875d-5d44423f0b2f","Type":"ContainerStarted","Data":"f310c0a22e8c7a16367dcaa27705cbf28b5e14d15b0944dd33a5490cf5c500f1"} Mar 19 09:51:54.786877 master-0 kubenswrapper[15202]: I0319 09:51:54.782203 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77dc968fc8-nnkkj" event={"ID":"180cd549-4f02-4a40-875d-5d44423f0b2f","Type":"ContainerStarted","Data":"ac6f899ac233efcf1bb257bca5fea3c2bcac334b3b7b7e170f0e892de7f73d4d"} Mar 19 09:51:54.786877 master-0 kubenswrapper[15202]: I0319 09:51:54.782264 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77dc968fc8-nnkkj" event={"ID":"180cd549-4f02-4a40-875d-5d44423f0b2f","Type":"ContainerStarted","Data":"56e56f74b69456a6b16425311d25f733d82631e7df2ed0cf58983797e5d419b8"} Mar 19 09:51:54.786877 master-0 kubenswrapper[15202]: I0319 09:51:54.782426 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:54.830908 master-0 kubenswrapper[15202]: I0319 09:51:54.829639 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-77dc968fc8-nnkkj" podStartSLOduration=2.829614123 podStartE2EDuration="2.829614123s" podCreationTimestamp="2026-03-19 09:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:54.817593227 +0000 UTC m=+1632.203008053" watchObservedRunningTime="2026-03-19 09:51:54.829614123 +0000 UTC 
m=+1632.215028939" Mar 19 09:51:54.852567 master-0 kubenswrapper[15202]: I0319 09:51:54.850824 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a13a111-1257-4963-8c30-51d28728449e" path="/var/lib/kubelet/pods/9a13a111-1257-4963-8c30-51d28728449e/volumes" Mar 19 09:51:55.796702 master-0 kubenswrapper[15202]: I0319 09:51:55.796642 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:51:58.030653 master-0 kubenswrapper[15202]: I0319 09:51:58.030551 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:52:03.025163 master-0 kubenswrapper[15202]: I0319 09:52:03.025073 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77dc968fc8-nnkkj" Mar 19 09:52:05.967253 master-0 kubenswrapper[15202]: I0319 09:52:05.966973 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7eb8479a-645c-40f7-852f-1b0fb72fa067","Type":"ContainerStarted","Data":"8970abfe2e8a30faf4b2eec98706d9decf8a0730e91cc9937c2326de290fce34"} Mar 19 09:52:07.260700 master-0 kubenswrapper[15202]: I0319 09:52:07.260566 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.100182104 podStartE2EDuration="19.260532576s" podCreationTimestamp="2026-03-19 09:51:48 +0000 UTC" firstStartedPulling="2026-03-19 09:51:49.26909419 +0000 UTC m=+1626.654509006" lastFinishedPulling="2026-03-19 09:52:04.429444662 +0000 UTC m=+1641.814859478" observedRunningTime="2026-03-19 09:52:07.257808049 +0000 UTC m=+1644.643222865" watchObservedRunningTime="2026-03-19 09:52:07.260532576 +0000 UTC m=+1644.645947422" Mar 19 09:52:08.943946 master-0 kubenswrapper[15202]: I0319 09:52:08.943266 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-j7tk6"] Mar 19 
09:52:08.945623 master-0 kubenswrapper[15202]: I0319 09:52:08.945559 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.022095 master-0 kubenswrapper[15202]: I0319 09:52:09.022025 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j7tk6"] Mar 19 09:52:09.043378 master-0 kubenswrapper[15202]: I0319 09:52:09.043310 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9856624e-09c3-4b1c-bf33-3f57ba441335-operator-scripts\") pod \"nova-api-db-create-j7tk6\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") " pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.043672 master-0 kubenswrapper[15202]: I0319 09:52:09.043582 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6l7d\" (UniqueName: \"kubernetes.io/projected/9856624e-09c3-4b1c-bf33-3f57ba441335-kube-api-access-m6l7d\") pod \"nova-api-db-create-j7tk6\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") " pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.146812 master-0 kubenswrapper[15202]: I0319 09:52:09.146726 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9856624e-09c3-4b1c-bf33-3f57ba441335-operator-scripts\") pod \"nova-api-db-create-j7tk6\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") " pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.147044 master-0 kubenswrapper[15202]: I0319 09:52:09.146851 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6l7d\" (UniqueName: \"kubernetes.io/projected/9856624e-09c3-4b1c-bf33-3f57ba441335-kube-api-access-m6l7d\") pod \"nova-api-db-create-j7tk6\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") " 
pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.147701 master-0 kubenswrapper[15202]: I0319 09:52:09.147640 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9856624e-09c3-4b1c-bf33-3f57ba441335-operator-scripts\") pod \"nova-api-db-create-j7tk6\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") " pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.639239 master-0 kubenswrapper[15202]: I0319 09:52:09.639174 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6l7d\" (UniqueName: \"kubernetes.io/projected/9856624e-09c3-4b1c-bf33-3f57ba441335-kube-api-access-m6l7d\") pod \"nova-api-db-create-j7tk6\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") " pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:09.875424 master-0 kubenswrapper[15202]: I0319 09:52:09.875317 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j7tk6" Mar 19 09:52:10.835812 master-0 kubenswrapper[15202]: I0319 09:52:10.834783 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zlb6q"] Mar 19 09:52:10.836799 master-0 kubenswrapper[15202]: I0319 09:52:10.836606 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zlb6q" Mar 19 09:52:10.840024 master-0 kubenswrapper[15202]: I0319 09:52:10.839052 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j7tk6"] Mar 19 09:52:10.899127 master-0 kubenswrapper[15202]: I0319 09:52:10.898815 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt9qt\" (UniqueName: \"kubernetes.io/projected/fcea5022-9090-4c15-8a38-91e20e844584-kube-api-access-tt9qt\") pod \"nova-cell0-db-create-zlb6q\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") " pod="openstack/nova-cell0-db-create-zlb6q" Mar 19 09:52:10.899563 master-0 kubenswrapper[15202]: I0319 09:52:10.899517 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcea5022-9090-4c15-8a38-91e20e844584-operator-scripts\") pod \"nova-cell0-db-create-zlb6q\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") " pod="openstack/nova-cell0-db-create-zlb6q" Mar 19 09:52:11.001609 master-0 kubenswrapper[15202]: I0319 09:52:11.001552 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcea5022-9090-4c15-8a38-91e20e844584-operator-scripts\") pod \"nova-cell0-db-create-zlb6q\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") " pod="openstack/nova-cell0-db-create-zlb6q" Mar 19 09:52:11.001920 master-0 kubenswrapper[15202]: I0319 09:52:11.001899 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt9qt\" (UniqueName: \"kubernetes.io/projected/fcea5022-9090-4c15-8a38-91e20e844584-kube-api-access-tt9qt\") pod \"nova-cell0-db-create-zlb6q\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") " pod="openstack/nova-cell0-db-create-zlb6q" Mar 19 09:52:11.002547 master-0 kubenswrapper[15202]: I0319 09:52:11.002507 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcea5022-9090-4c15-8a38-91e20e844584-operator-scripts\") pod \"nova-cell0-db-create-zlb6q\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") " pod="openstack/nova-cell0-db-create-zlb6q" Mar 19 09:52:11.028591 master-0 kubenswrapper[15202]: I0319 09:52:11.028534 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j7tk6" event={"ID":"9856624e-09c3-4b1c-bf33-3f57ba441335","Type":"ContainerStarted","Data":"7ede0fbaa835e3e890a8a8d6811f26588c8548d15111d6f2f74a44a8263bc666"} Mar 19 09:52:11.360780 master-0 kubenswrapper[15202]: I0319 09:52:11.360708 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:52:11.361077 master-0 kubenswrapper[15202]: I0319 09:52:11.361038 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3a5fd-default-external-api-0" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-log" containerID="cri-o://66072e97e8c03addc037a16bdd832c64e2ef534fcdab06a5df8ead3d9c8d05fb" gracePeriod=30 Mar 19 09:52:11.361200 master-0 kubenswrapper[15202]: I0319 09:52:11.361128 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3a5fd-default-external-api-0" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-httpd" containerID="cri-o://369ad3b9f0e4ef9c96b7b0fdf9c622246a6c2506dd02977cc69001dd6fe027e7" gracePeriod=30 Mar 19 09:52:11.372281 master-0 kubenswrapper[15202]: I0319 09:52:11.372207 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8395-account-create-update-8nzrz"] Mar 19 09:52:11.374794 master-0 kubenswrapper[15202]: I0319 09:52:11.374747 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:11.380491 master-0 kubenswrapper[15202]: I0319 09:52:11.377418 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 19 09:52:11.396271 master-0 kubenswrapper[15202]: I0319 09:52:11.396231 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zlb6q"]
Mar 19 09:52:11.408382 master-0 kubenswrapper[15202]: I0319 09:52:11.408326 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c95d54-8750-4daf-a981-9544bc6b1fc7-operator-scripts\") pod \"nova-api-8395-account-create-update-8nzrz\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") " pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:11.408934 master-0 kubenswrapper[15202]: I0319 09:52:11.408836 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m7fz\" (UniqueName: \"kubernetes.io/projected/87c95d54-8750-4daf-a981-9544bc6b1fc7-kube-api-access-7m7fz\") pod \"nova-api-8395-account-create-update-8nzrz\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") " pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:11.511119 master-0 kubenswrapper[15202]: I0319 09:52:11.511040 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7fz\" (UniqueName: \"kubernetes.io/projected/87c95d54-8750-4daf-a981-9544bc6b1fc7-kube-api-access-7m7fz\") pod \"nova-api-8395-account-create-update-8nzrz\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") " pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:11.511402 master-0 kubenswrapper[15202]: I0319 09:52:11.511187 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c95d54-8750-4daf-a981-9544bc6b1fc7-operator-scripts\") pod \"nova-api-8395-account-create-update-8nzrz\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") " pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:11.512954 master-0 kubenswrapper[15202]: I0319 09:52:11.512891 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c95d54-8750-4daf-a981-9544bc6b1fc7-operator-scripts\") pod \"nova-api-8395-account-create-update-8nzrz\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") " pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:11.669459 master-0 kubenswrapper[15202]: I0319 09:52:11.669283 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8395-account-create-update-8nzrz"]
Mar 19 09:52:12.045941 master-0 kubenswrapper[15202]: I0319 09:52:12.044889 15202 generic.go:334] "Generic (PLEG): container finished" podID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerID="66072e97e8c03addc037a16bdd832c64e2ef534fcdab06a5df8ead3d9c8d05fb" exitCode=143
Mar 19 09:52:12.045941 master-0 kubenswrapper[15202]: I0319 09:52:12.044979 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"20a9e839-7eb3-4ba6-bc63-7220be59d238","Type":"ContainerDied","Data":"66072e97e8c03addc037a16bdd832c64e2ef534fcdab06a5df8ead3d9c8d05fb"}
Mar 19 09:52:12.047279 master-0 kubenswrapper[15202]: I0319 09:52:12.047225 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j7tk6" event={"ID":"9856624e-09c3-4b1c-bf33-3f57ba441335","Type":"ContainerStarted","Data":"84a5b0ed57e92810141c8a1b5a49d2f7d9b9902da916052304c5505bf8eabb4b"}
Mar 19 09:52:12.480205 master-0 kubenswrapper[15202]: I0319 09:52:12.480152 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m7fz\" (UniqueName: \"kubernetes.io/projected/87c95d54-8750-4daf-a981-9544bc6b1fc7-kube-api-access-7m7fz\") pod \"nova-api-8395-account-create-update-8nzrz\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") " pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:12.480414 master-0 kubenswrapper[15202]: I0319 09:52:12.480294 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt9qt\" (UniqueName: \"kubernetes.io/projected/fcea5022-9090-4c15-8a38-91e20e844584-kube-api-access-tt9qt\") pod \"nova-cell0-db-create-zlb6q\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") " pod="openstack/nova-cell0-db-create-zlb6q"
Mar 19 09:52:12.615738 master-0 kubenswrapper[15202]: I0319 09:52:12.615655 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:12.692491 master-0 kubenswrapper[15202]: I0319 09:52:12.691723 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zlb6q"
Mar 19 09:52:13.006327 master-0 kubenswrapper[15202]: I0319 09:52:13.000152 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-j7tk6" podStartSLOduration=6.000127164 podStartE2EDuration="6.000127164s" podCreationTimestamp="2026-03-19 09:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:12.989688576 +0000 UTC m=+1650.375103392" watchObservedRunningTime="2026-03-19 09:52:13.000127164 +0000 UTC m=+1650.385541980"
Mar 19 09:52:13.193238 master-0 kubenswrapper[15202]: I0319 09:52:13.191498 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jshcc"]
Mar 19 09:52:13.193913 master-0 kubenswrapper[15202]: I0319 09:52:13.193806 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.329746 master-0 kubenswrapper[15202]: I0319 09:52:13.328691 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jshcc"]
Mar 19 09:52:13.344500 master-0 kubenswrapper[15202]: I0319 09:52:13.344447 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zlb6q"]
Mar 19 09:52:13.354576 master-0 kubenswrapper[15202]: I0319 09:52:13.354461 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svrc9\" (UniqueName: \"kubernetes.io/projected/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-kube-api-access-svrc9\") pod \"nova-cell1-db-create-jshcc\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.354832 master-0 kubenswrapper[15202]: I0319 09:52:13.354632 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-operator-scripts\") pod \"nova-cell1-db-create-jshcc\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.358011 master-0 kubenswrapper[15202]: I0319 09:52:13.357958 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8395-account-create-update-8nzrz"]
Mar 19 09:52:13.404574 master-0 kubenswrapper[15202]: I0319 09:52:13.401163 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"]
Mar 19 09:52:13.404574 master-0 kubenswrapper[15202]: I0319 09:52:13.401508 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3a5fd-default-internal-api-0" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-log" containerID="cri-o://98854ebdc204abcd4cff2dac5ca4f7526aadfe02d73a0ff60ff32c560e6899db" gracePeriod=30
Mar 19 09:52:13.404574 master-0 kubenswrapper[15202]: I0319 09:52:13.401672 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3a5fd-default-internal-api-0" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-httpd" containerID="cri-o://df5424396812996687c783a3b69bbe79ddcfc74205891ffc7ddd501a9b5f7d01" gracePeriod=30
Mar 19 09:52:13.456507 master-0 kubenswrapper[15202]: I0319 09:52:13.456421 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svrc9\" (UniqueName: \"kubernetes.io/projected/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-kube-api-access-svrc9\") pod \"nova-cell1-db-create-jshcc\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.456671 master-0 kubenswrapper[15202]: I0319 09:52:13.456587 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-operator-scripts\") pod \"nova-cell1-db-create-jshcc\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.457939 master-0 kubenswrapper[15202]: I0319 09:52:13.457902 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-operator-scripts\") pod \"nova-cell1-db-create-jshcc\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.659520 master-0 kubenswrapper[15202]: I0319 09:52:13.659426 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svrc9\" (UniqueName: \"kubernetes.io/projected/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-kube-api-access-svrc9\") pod \"nova-cell1-db-create-jshcc\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:13.838303 master-0 kubenswrapper[15202]: I0319 09:52:13.838215 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jshcc"
Mar 19 09:52:14.077595 master-0 kubenswrapper[15202]: I0319 09:52:14.077539 15202 generic.go:334] "Generic (PLEG): container finished" podID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerID="98854ebdc204abcd4cff2dac5ca4f7526aadfe02d73a0ff60ff32c560e6899db" exitCode=143
Mar 19 09:52:14.077714 master-0 kubenswrapper[15202]: I0319 09:52:14.077626 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"496c49f4-9bde-41e5-ab83-477abcf1c5ef","Type":"ContainerDied","Data":"98854ebdc204abcd4cff2dac5ca4f7526aadfe02d73a0ff60ff32c560e6899db"}
Mar 19 09:52:14.097190 master-0 kubenswrapper[15202]: I0319 09:52:14.097120 15202 generic.go:334] "Generic (PLEG): container finished" podID="9856624e-09c3-4b1c-bf33-3f57ba441335" containerID="84a5b0ed57e92810141c8a1b5a49d2f7d9b9902da916052304c5505bf8eabb4b" exitCode=0
Mar 19 09:52:14.097557 master-0 kubenswrapper[15202]: I0319 09:52:14.097228 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j7tk6" event={"ID":"9856624e-09c3-4b1c-bf33-3f57ba441335","Type":"ContainerDied","Data":"84a5b0ed57e92810141c8a1b5a49d2f7d9b9902da916052304c5505bf8eabb4b"}
Mar 19 09:52:14.099441 master-0 kubenswrapper[15202]: I0319 09:52:14.099403 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8395-account-create-update-8nzrz" event={"ID":"87c95d54-8750-4daf-a981-9544bc6b1fc7","Type":"ContainerStarted","Data":"ab0b41eddb547d86b9031856e7c1c3966231fef520771fc062128a8713086438"}
Mar 19 09:52:14.099441 master-0 kubenswrapper[15202]: I0319 09:52:14.099434 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8395-account-create-update-8nzrz" event={"ID":"87c95d54-8750-4daf-a981-9544bc6b1fc7","Type":"ContainerStarted","Data":"bc7f492916019ee85d515404beb4bad79222f3d3bd1365adf863ffde097d7ebc"}
Mar 19 09:52:14.100811 master-0 kubenswrapper[15202]: I0319 09:52:14.100778 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zlb6q" event={"ID":"fcea5022-9090-4c15-8a38-91e20e844584","Type":"ContainerStarted","Data":"355c6401912de6e8deb25865a1a9284a576d2088603c72a2deeb1812456a827a"}
Mar 19 09:52:14.100811 master-0 kubenswrapper[15202]: I0319 09:52:14.100803 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zlb6q" event={"ID":"fcea5022-9090-4c15-8a38-91e20e844584","Type":"ContainerStarted","Data":"a8381bfa2b5d305860e9a681428c2200be72b2ee51a68530feb2c2531cde73f9"}
Mar 19 09:52:14.451646 master-0 kubenswrapper[15202]: I0319 09:52:14.451556 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67c9b9475d-ksb2w"
Mar 19 09:52:14.528597 master-0 kubenswrapper[15202]: I0319 09:52:14.528544 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67c9b9475d-ksb2w"
Mar 19 09:52:15.113905 master-0 kubenswrapper[15202]: I0319 09:52:15.113770 15202 generic.go:334] "Generic (PLEG): container finished" podID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerID="369ad3b9f0e4ef9c96b7b0fdf9c622246a6c2506dd02977cc69001dd6fe027e7" exitCode=0
Mar 19 09:52:15.113905 master-0 kubenswrapper[15202]: I0319 09:52:15.113871 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"20a9e839-7eb3-4ba6-bc63-7220be59d238","Type":"ContainerDied","Data":"369ad3b9f0e4ef9c96b7b0fdf9c622246a6c2506dd02977cc69001dd6fe027e7"}
Mar 19 09:52:15.554095 master-0 kubenswrapper[15202]: I0319 09:52:15.554052 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j7tk6"
Mar 19 09:52:15.633417 master-0 kubenswrapper[15202]: I0319 09:52:15.633347 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9856624e-09c3-4b1c-bf33-3f57ba441335-operator-scripts\") pod \"9856624e-09c3-4b1c-bf33-3f57ba441335\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") "
Mar 19 09:52:15.633689 master-0 kubenswrapper[15202]: I0319 09:52:15.633667 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6l7d\" (UniqueName: \"kubernetes.io/projected/9856624e-09c3-4b1c-bf33-3f57ba441335-kube-api-access-m6l7d\") pod \"9856624e-09c3-4b1c-bf33-3f57ba441335\" (UID: \"9856624e-09c3-4b1c-bf33-3f57ba441335\") "
Mar 19 09:52:15.633862 master-0 kubenswrapper[15202]: I0319 09:52:15.633802 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9856624e-09c3-4b1c-bf33-3f57ba441335-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9856624e-09c3-4b1c-bf33-3f57ba441335" (UID: "9856624e-09c3-4b1c-bf33-3f57ba441335"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:52:15.634192 master-0 kubenswrapper[15202]: I0319 09:52:15.634158 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9856624e-09c3-4b1c-bf33-3f57ba441335-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:52:15.637052 master-0 kubenswrapper[15202]: I0319 09:52:15.636995 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9856624e-09c3-4b1c-bf33-3f57ba441335-kube-api-access-m6l7d" (OuterVolumeSpecName: "kube-api-access-m6l7d") pod "9856624e-09c3-4b1c-bf33-3f57ba441335" (UID: "9856624e-09c3-4b1c-bf33-3f57ba441335"). InnerVolumeSpecName "kube-api-access-m6l7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:52:15.737042 master-0 kubenswrapper[15202]: I0319 09:52:15.736425 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6l7d\" (UniqueName: \"kubernetes.io/projected/9856624e-09c3-4b1c-bf33-3f57ba441335-kube-api-access-m6l7d\") on node \"master-0\" DevicePath \"\""
Mar 19 09:52:16.139296 master-0 kubenswrapper[15202]: I0319 09:52:16.139072 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j7tk6" event={"ID":"9856624e-09c3-4b1c-bf33-3f57ba441335","Type":"ContainerDied","Data":"7ede0fbaa835e3e890a8a8d6811f26588c8548d15111d6f2f74a44a8263bc666"}
Mar 19 09:52:16.139296 master-0 kubenswrapper[15202]: I0319 09:52:16.139128 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ede0fbaa835e3e890a8a8d6811f26588c8548d15111d6f2f74a44a8263bc666"
Mar 19 09:52:16.139296 master-0 kubenswrapper[15202]: I0319 09:52:16.139196 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j7tk6"
Mar 19 09:52:16.209274 master-0 kubenswrapper[15202]: I0319 09:52:16.209200 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ea37-account-create-update-c8nf7"]
Mar 19 09:52:16.210496 master-0 kubenswrapper[15202]: E0319 09:52:16.210055 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9856624e-09c3-4b1c-bf33-3f57ba441335" containerName="mariadb-database-create"
Mar 19 09:52:16.210496 master-0 kubenswrapper[15202]: I0319 09:52:16.210130 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9856624e-09c3-4b1c-bf33-3f57ba441335" containerName="mariadb-database-create"
Mar 19 09:52:16.210955 master-0 kubenswrapper[15202]: I0319 09:52:16.210719 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9856624e-09c3-4b1c-bf33-3f57ba441335" containerName="mariadb-database-create"
Mar 19 09:52:16.212110 master-0 kubenswrapper[15202]: I0319 09:52:16.212070 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:16.214633 master-0 kubenswrapper[15202]: I0319 09:52:16.214582 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 19 09:52:16.378492 master-0 kubenswrapper[15202]: I0319 09:52:16.377255 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea37-account-create-update-c8nf7"]
Mar 19 09:52:16.399571 master-0 kubenswrapper[15202]: I0319 09:52:16.399318 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jshcc"]
Mar 19 09:52:17.153319 master-0 kubenswrapper[15202]: I0319 09:52:17.153050 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jshcc" event={"ID":"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db","Type":"ContainerStarted","Data":"4018b0ae4fb460db1cbc894596cf06ad4ce784a13810c83cbfd3e89536b62b70"}
Mar 19 09:52:17.153319 master-0 kubenswrapper[15202]: I0319 09:52:17.153138 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jshcc" event={"ID":"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db","Type":"ContainerStarted","Data":"cf2d657708fee526b5a48d06d71611b574133e8a5aedcf646d11d554b05c92c0"}
Mar 19 09:52:17.154870 master-0 kubenswrapper[15202]: I0319 09:52:17.154670 15202 generic.go:334] "Generic (PLEG): container finished" podID="87c95d54-8750-4daf-a981-9544bc6b1fc7" containerID="ab0b41eddb547d86b9031856e7c1c3966231fef520771fc062128a8713086438" exitCode=0
Mar 19 09:52:17.154870 master-0 kubenswrapper[15202]: I0319 09:52:17.154720 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8395-account-create-update-8nzrz" event={"ID":"87c95d54-8750-4daf-a981-9544bc6b1fc7","Type":"ContainerDied","Data":"ab0b41eddb547d86b9031856e7c1c3966231fef520771fc062128a8713086438"}
Mar 19 09:52:17.156337 master-0 kubenswrapper[15202]: I0319 09:52:17.156289 15202 generic.go:334] "Generic (PLEG): container finished" podID="fcea5022-9090-4c15-8a38-91e20e844584" containerID="355c6401912de6e8deb25865a1a9284a576d2088603c72a2deeb1812456a827a" exitCode=0
Mar 19 09:52:17.156503 master-0 kubenswrapper[15202]: I0319 09:52:17.156386 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zlb6q" event={"ID":"fcea5022-9090-4c15-8a38-91e20e844584","Type":"ContainerDied","Data":"355c6401912de6e8deb25865a1a9284a576d2088603c72a2deeb1812456a827a"}
Mar 19 09:52:17.158496 master-0 kubenswrapper[15202]: I0319 09:52:17.158452 15202 generic.go:334] "Generic (PLEG): container finished" podID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerID="df5424396812996687c783a3b69bbe79ddcfc74205891ffc7ddd501a9b5f7d01" exitCode=0
Mar 19 09:52:17.158496 master-0 kubenswrapper[15202]: I0319 09:52:17.158480 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"496c49f4-9bde-41e5-ab83-477abcf1c5ef","Type":"ContainerDied","Data":"df5424396812996687c783a3b69bbe79ddcfc74205891ffc7ddd501a9b5f7d01"}
Mar 19 09:52:17.973854 master-0 kubenswrapper[15202]: I0319 09:52:17.972787 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8395-account-create-update-8nzrz" podStartSLOduration=7.972754445 podStartE2EDuration="7.972754445s" podCreationTimestamp="2026-03-19 09:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:17.961341004 +0000 UTC m=+1655.346755830" watchObservedRunningTime="2026-03-19 09:52:17.972754445 +0000 UTC m=+1655.358169261"
Mar 19 09:52:18.420825 master-0 kubenswrapper[15202]: I0319 09:52:18.420501 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h86v5\" (UniqueName: \"kubernetes.io/projected/6898ed5b-562b-415f-93f6-ddf0c1e01558-kube-api-access-h86v5\") pod \"nova-cell0-ea37-account-create-update-c8nf7\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:18.421327 master-0 kubenswrapper[15202]: I0319 09:52:18.420936 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6898ed5b-562b-415f-93f6-ddf0c1e01558-operator-scripts\") pod \"nova-cell0-ea37-account-create-update-c8nf7\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:18.528455 master-0 kubenswrapper[15202]: I0319 09:52:18.523507 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6898ed5b-562b-415f-93f6-ddf0c1e01558-operator-scripts\") pod \"nova-cell0-ea37-account-create-update-c8nf7\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:18.528455 master-0 kubenswrapper[15202]: I0319 09:52:18.523692 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h86v5\" (UniqueName: \"kubernetes.io/projected/6898ed5b-562b-415f-93f6-ddf0c1e01558-kube-api-access-h86v5\") pod \"nova-cell0-ea37-account-create-update-c8nf7\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:18.528455 master-0 kubenswrapper[15202]: I0319 09:52:18.525898 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6898ed5b-562b-415f-93f6-ddf0c1e01558-operator-scripts\") pod \"nova-cell0-ea37-account-create-update-c8nf7\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:18.730734 master-0 kubenswrapper[15202]: I0319 09:52:18.730641 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:18.743164 master-0 kubenswrapper[15202]: I0319 09:52:18.743074 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zlb6q"
Mar 19 09:52:18.851971 master-0 kubenswrapper[15202]: I0319 09:52:18.851909 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c95d54-8750-4daf-a981-9544bc6b1fc7-operator-scripts\") pod \"87c95d54-8750-4daf-a981-9544bc6b1fc7\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") "
Mar 19 09:52:18.852203 master-0 kubenswrapper[15202]: I0319 09:52:18.851993 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt9qt\" (UniqueName: \"kubernetes.io/projected/fcea5022-9090-4c15-8a38-91e20e844584-kube-api-access-tt9qt\") pod \"fcea5022-9090-4c15-8a38-91e20e844584\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") "
Mar 19 09:52:18.852295 master-0 kubenswrapper[15202]: I0319 09:52:18.852269 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcea5022-9090-4c15-8a38-91e20e844584-operator-scripts\") pod \"fcea5022-9090-4c15-8a38-91e20e844584\" (UID: \"fcea5022-9090-4c15-8a38-91e20e844584\") "
Mar 19 09:52:18.852341 master-0 kubenswrapper[15202]: I0319 09:52:18.852331 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m7fz\" (UniqueName: \"kubernetes.io/projected/87c95d54-8750-4daf-a981-9544bc6b1fc7-kube-api-access-7m7fz\") pod \"87c95d54-8750-4daf-a981-9544bc6b1fc7\" (UID: \"87c95d54-8750-4daf-a981-9544bc6b1fc7\") "
Mar 19 09:52:18.854453 master-0 kubenswrapper[15202]: I0319 09:52:18.852619 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c95d54-8750-4daf-a981-9544bc6b1fc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87c95d54-8750-4daf-a981-9544bc6b1fc7" (UID: "87c95d54-8750-4daf-a981-9544bc6b1fc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:52:18.854453 master-0 kubenswrapper[15202]: I0319 09:52:18.852648 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcea5022-9090-4c15-8a38-91e20e844584-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcea5022-9090-4c15-8a38-91e20e844584" (UID: "fcea5022-9090-4c15-8a38-91e20e844584"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:52:18.854453 master-0 kubenswrapper[15202]: I0319 09:52:18.853614 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcea5022-9090-4c15-8a38-91e20e844584-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:52:18.854453 master-0 kubenswrapper[15202]: I0319 09:52:18.853644 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c95d54-8750-4daf-a981-9544bc6b1fc7-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:52:18.855534 master-0 kubenswrapper[15202]: I0319 09:52:18.855496 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcea5022-9090-4c15-8a38-91e20e844584-kube-api-access-tt9qt" (OuterVolumeSpecName: "kube-api-access-tt9qt") pod "fcea5022-9090-4c15-8a38-91e20e844584" (UID: "fcea5022-9090-4c15-8a38-91e20e844584"). InnerVolumeSpecName "kube-api-access-tt9qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:52:18.856321 master-0 kubenswrapper[15202]: I0319 09:52:18.856270 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c95d54-8750-4daf-a981-9544bc6b1fc7-kube-api-access-7m7fz" (OuterVolumeSpecName: "kube-api-access-7m7fz") pod "87c95d54-8750-4daf-a981-9544bc6b1fc7" (UID: "87c95d54-8750-4daf-a981-9544bc6b1fc7"). InnerVolumeSpecName "kube-api-access-7m7fz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:52:18.956105 master-0 kubenswrapper[15202]: I0319 09:52:18.956043 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m7fz\" (UniqueName: \"kubernetes.io/projected/87c95d54-8750-4daf-a981-9544bc6b1fc7-kube-api-access-7m7fz\") on node \"master-0\" DevicePath \"\""
Mar 19 09:52:18.956394 master-0 kubenswrapper[15202]: I0319 09:52:18.956382 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt9qt\" (UniqueName: \"kubernetes.io/projected/fcea5022-9090-4c15-8a38-91e20e844584-kube-api-access-tt9qt\") on node \"master-0\" DevicePath \"\""
Mar 19 09:52:19.191188 master-0 kubenswrapper[15202]: I0319 09:52:19.191123 15202 generic.go:334] "Generic (PLEG): container finished" podID="d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" containerID="4018b0ae4fb460db1cbc894596cf06ad4ce784a13810c83cbfd3e89536b62b70" exitCode=0
Mar 19 09:52:19.191427 master-0 kubenswrapper[15202]: I0319 09:52:19.191224 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jshcc" event={"ID":"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db","Type":"ContainerDied","Data":"4018b0ae4fb460db1cbc894596cf06ad4ce784a13810c83cbfd3e89536b62b70"}
Mar 19 09:52:19.193865 master-0 kubenswrapper[15202]: I0319 09:52:19.193831 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8395-account-create-update-8nzrz" event={"ID":"87c95d54-8750-4daf-a981-9544bc6b1fc7","Type":"ContainerDied","Data":"bc7f492916019ee85d515404beb4bad79222f3d3bd1365adf863ffde097d7ebc"}
Mar 19 09:52:19.193950 master-0 kubenswrapper[15202]: I0319 09:52:19.193867 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7f492916019ee85d515404beb4bad79222f3d3bd1365adf863ffde097d7ebc"
Mar 19 09:52:19.193950 master-0 kubenswrapper[15202]: I0319 09:52:19.193915 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8395-account-create-update-8nzrz"
Mar 19 09:52:19.198624 master-0 kubenswrapper[15202]: I0319 09:52:19.198574 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zlb6q" event={"ID":"fcea5022-9090-4c15-8a38-91e20e844584","Type":"ContainerDied","Data":"a8381bfa2b5d305860e9a681428c2200be72b2ee51a68530feb2c2531cde73f9"}
Mar 19 09:52:19.198624 master-0 kubenswrapper[15202]: I0319 09:52:19.198611 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zlb6q"
Mar 19 09:52:19.198624 master-0 kubenswrapper[15202]: I0319 09:52:19.198629 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8381bfa2b5d305860e9a681428c2200be72b2ee51a68530feb2c2531cde73f9"
Mar 19 09:52:19.397213 master-0 kubenswrapper[15202]: I0319 09:52:19.397094 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h86v5\" (UniqueName: \"kubernetes.io/projected/6898ed5b-562b-415f-93f6-ddf0c1e01558-kube-api-access-h86v5\") pod \"nova-cell0-ea37-account-create-update-c8nf7\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:19.563610 master-0 kubenswrapper[15202]: I0319 09:52:19.554512 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7"
Mar 19 09:52:19.766612 master-0 kubenswrapper[15202]: I0319 09:52:19.766538 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.900662 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5ec6-account-create-update-fn8fv"]
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: E0319 09:52:19.901202 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcea5022-9090-4c15-8a38-91e20e844584" containerName="mariadb-database-create"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901217 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcea5022-9090-4c15-8a38-91e20e844584" containerName="mariadb-database-create"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: E0319 09:52:19.901229 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-log"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901235 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-log"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: E0319 09:52:19.901248 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c95d54-8750-4daf-a981-9544bc6b1fc7" containerName="mariadb-account-create-update"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901255 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c95d54-8750-4daf-a981-9544bc6b1fc7" containerName="mariadb-account-create-update"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: E0319 09:52:19.901290 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-httpd"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901297 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-httpd"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901505 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-httpd"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901536 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" containerName="glance-log"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901549 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c95d54-8750-4daf-a981-9544bc6b1fc7" containerName="mariadb-account-create-update"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901564 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcea5022-9090-4c15-8a38-91e20e844584" containerName="mariadb-database-create"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901457 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-config-data\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901723 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-scripts\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.901944 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.902016 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-public-tls-certs\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.902060 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-logs\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.902176 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-httpd-run\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.902254 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-combined-ca-bundle\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.902291 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv"
Mar 19 09:52:19.902585 master-0 kubenswrapper[15202]: I0319 09:52:19.902386 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94vk\" (UniqueName: \"kubernetes.io/projected/20a9e839-7eb3-4ba6-bc63-7220be59d238-kube-api-access-k94vk\") pod \"20a9e839-7eb3-4ba6-bc63-7220be59d238\" (UID: \"20a9e839-7eb3-4ba6-bc63-7220be59d238\") "
Mar 19 09:52:19.906105 master-0 kubenswrapper[15202]: I0319 09:52:19.906030 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 19 09:52:19.907503 master-0 kubenswrapper[15202]: I0319 09:52:19.906337 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-logs" (OuterVolumeSpecName: "logs") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:52:19.907503 master-0 kubenswrapper[15202]: I0319 09:52:19.906601 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:52:19.913748 master-0 kubenswrapper[15202]: I0319 09:52:19.913664 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a9e839-7eb3-4ba6-bc63-7220be59d238-kube-api-access-k94vk" (OuterVolumeSpecName: "kube-api-access-k94vk") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "kube-api-access-k94vk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:52:19.913889 master-0 kubenswrapper[15202]: I0319 09:52:19.913866 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-scripts" (OuterVolumeSpecName: "scripts") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:52:19.954134 master-0 kubenswrapper[15202]: I0319 09:52:19.953991 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688" (OuterVolumeSpecName: "glance") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 09:52:19.954352 master-0 kubenswrapper[15202]: I0319 09:52:19.954282 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:52:20.003922 master-0 kubenswrapper[15202]: I0319 09:52:20.003837 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "public-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:20.005596 master-0 kubenswrapper[15202]: I0319 09:52:20.005509 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zmq6\" (UniqueName: \"kubernetes.io/projected/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-kube-api-access-9zmq6\") pod \"nova-cell1-5ec6-account-create-update-fn8fv\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.005936 master-0 kubenswrapper[15202]: I0319 09:52:20.005837 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-operator-scripts\") pod \"nova-cell1-5ec6-account-create-update-fn8fv\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.005936 master-0 kubenswrapper[15202]: I0319 09:52:20.005932 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.006019 master-0 kubenswrapper[15202]: I0319 09:52:20.005946 15202 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20a9e839-7eb3-4ba6-bc63-7220be59d238-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.006019 master-0 kubenswrapper[15202]: I0319 09:52:20.005958 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.006019 master-0 kubenswrapper[15202]: I0319 09:52:20.005969 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94vk\" (UniqueName: 
\"kubernetes.io/projected/20a9e839-7eb3-4ba6-bc63-7220be59d238-kube-api-access-k94vk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.006019 master-0 kubenswrapper[15202]: I0319 09:52:20.005977 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.006164 master-0 kubenswrapper[15202]: I0319 09:52:20.006044 15202 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") on node \"master-0\" " Mar 19 09:52:20.006164 master-0 kubenswrapper[15202]: I0319 09:52:20.006056 15202 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.038567 master-0 kubenswrapper[15202]: I0319 09:52:20.037604 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-config-data" (OuterVolumeSpecName: "config-data") pod "20a9e839-7eb3-4ba6-bc63-7220be59d238" (UID: "20a9e839-7eb3-4ba6-bc63-7220be59d238"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:20.047813 master-0 kubenswrapper[15202]: I0319 09:52:20.047653 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687479ff9d-8shw8"] Mar 19 09:52:20.048126 master-0 kubenswrapper[15202]: I0319 09:52:20.048033 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-687479ff9d-8shw8" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-log" containerID="cri-o://037976b1a5e8e92d16755532488ffdbebd0e4c908e4d2426cb213e35e9515dcf" gracePeriod=30 Mar 19 09:52:20.048239 master-0 kubenswrapper[15202]: I0319 09:52:20.048205 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-687479ff9d-8shw8" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-api" containerID="cri-o://9f93692bac9e51d66d621ba05b2a00361d12e9e51fa9284ff9722c9c29dd9a2f" gracePeriod=30 Mar 19 09:52:20.059548 master-0 kubenswrapper[15202]: I0319 09:52:20.058385 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5ec6-account-create-update-fn8fv"] Mar 19 09:52:20.067411 master-0 kubenswrapper[15202]: I0319 09:52:20.067320 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-zlb6q" podStartSLOduration=10.067296244 podStartE2EDuration="10.067296244s" podCreationTimestamp="2026-03-19 09:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:20.027164516 +0000 UTC m=+1657.412579342" watchObservedRunningTime="2026-03-19 09:52:20.067296244 +0000 UTC m=+1657.452711050" Mar 19 09:52:20.086365 master-0 kubenswrapper[15202]: I0319 09:52:20.086312 15202 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 09:52:20.086645 master-0 kubenswrapper[15202]: I0319 09:52:20.086598 15202 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f" (UniqueName: "kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688") on node "master-0" Mar 19 09:52:20.108423 master-0 kubenswrapper[15202]: I0319 09:52:20.108312 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zmq6\" (UniqueName: \"kubernetes.io/projected/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-kube-api-access-9zmq6\") pod \"nova-cell1-5ec6-account-create-update-fn8fv\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.108622 master-0 kubenswrapper[15202]: I0319 09:52:20.108596 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-operator-scripts\") pod \"nova-cell1-5ec6-account-create-update-fn8fv\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.109231 master-0 kubenswrapper[15202]: I0319 09:52:20.108709 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9e839-7eb3-4ba6-bc63-7220be59d238-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.109231 master-0 kubenswrapper[15202]: I0319 09:52:20.108724 15202 reconciler_common.go:293] "Volume detached for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.109496 master-0 kubenswrapper[15202]: I0319 09:52:20.109456 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-operator-scripts\") pod \"nova-cell1-5ec6-account-create-update-fn8fv\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.218025 master-0 kubenswrapper[15202]: I0319 09:52:20.217964 15202 generic.go:334] "Generic (PLEG): container finished" podID="fa213423-98fc-446d-9208-33d884780995" containerID="037976b1a5e8e92d16755532488ffdbebd0e4c908e4d2426cb213e35e9515dcf" exitCode=143 Mar 19 09:52:20.218296 master-0 kubenswrapper[15202]: I0319 09:52:20.218044 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687479ff9d-8shw8" event={"ID":"fa213423-98fc-446d-9208-33d884780995","Type":"ContainerDied","Data":"037976b1a5e8e92d16755532488ffdbebd0e4c908e4d2426cb213e35e9515dcf"} Mar 19 09:52:20.220340 master-0 kubenswrapper[15202]: I0319 09:52:20.220295 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:20.220412 master-0 kubenswrapper[15202]: I0319 09:52:20.220362 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"20a9e839-7eb3-4ba6-bc63-7220be59d238","Type":"ContainerDied","Data":"ecce5544b338800bfb03c9c629211eb5feb9d9a1715988c3650175f18c5ef6bc"} Mar 19 09:52:20.220495 master-0 kubenswrapper[15202]: I0319 09:52:20.220406 15202 scope.go:117] "RemoveContainer" containerID="369ad3b9f0e4ef9c96b7b0fdf9c622246a6c2506dd02977cc69001dd6fe027e7" Mar 19 09:52:20.258672 master-0 kubenswrapper[15202]: I0319 09:52:20.258609 15202 scope.go:117] "RemoveContainer" containerID="66072e97e8c03addc037a16bdd832c64e2ef534fcdab06a5df8ead3d9c8d05fb" Mar 19 09:52:20.330578 master-0 kubenswrapper[15202]: I0319 09:52:20.322334 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zmq6\" (UniqueName: 
\"kubernetes.io/projected/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-kube-api-access-9zmq6\") pod \"nova-cell1-5ec6-account-create-update-fn8fv\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.343801 master-0 kubenswrapper[15202]: I0319 09:52:20.343724 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:20.700711 master-0 kubenswrapper[15202]: I0319 09:52:20.700591 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jshcc" Mar 19 09:52:20.794847 master-0 kubenswrapper[15202]: W0319 09:52:20.794786 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6898ed5b_562b_415f_93f6_ddf0c1e01558.slice/crio-321c20bf5b91cf80e7ec96782125637e36e7449ca88af9bdb397b139d68817ae WatchSource:0}: Error finding container 321c20bf5b91cf80e7ec96782125637e36e7449ca88af9bdb397b139d68817ae: Status 404 returned error can't find the container with id 321c20bf5b91cf80e7ec96782125637e36e7449ca88af9bdb397b139d68817ae Mar 19 09:52:20.806481 master-0 kubenswrapper[15202]: I0319 09:52:20.806377 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea37-account-create-update-c8nf7"] Mar 19 09:52:20.830023 master-0 kubenswrapper[15202]: I0319 09:52:20.829979 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svrc9\" (UniqueName: \"kubernetes.io/projected/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-kube-api-access-svrc9\") pod \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " Mar 19 09:52:20.830539 master-0 kubenswrapper[15202]: I0319 09:52:20.830520 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-operator-scripts\") pod \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\" (UID: \"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db\") " Mar 19 09:52:20.830991 master-0 kubenswrapper[15202]: I0319 09:52:20.830948 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" (UID: "d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:20.831744 master-0 kubenswrapper[15202]: I0319 09:52:20.831726 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:20.834679 master-0 kubenswrapper[15202]: I0319 09:52:20.834633 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-kube-api-access-svrc9" (OuterVolumeSpecName: "kube-api-access-svrc9") pod "d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" (UID: "d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db"). InnerVolumeSpecName "kube-api-access-svrc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:20.935375 master-0 kubenswrapper[15202]: I0319 09:52:20.935311 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svrc9\" (UniqueName: \"kubernetes.io/projected/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db-kube-api-access-svrc9\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:21.240566 master-0 kubenswrapper[15202]: I0319 09:52:21.239672 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" event={"ID":"6898ed5b-562b-415f-93f6-ddf0c1e01558","Type":"ContainerStarted","Data":"321c20bf5b91cf80e7ec96782125637e36e7449ca88af9bdb397b139d68817ae"} Mar 19 09:52:21.244712 master-0 kubenswrapper[15202]: I0319 09:52:21.244633 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jshcc" event={"ID":"d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db","Type":"ContainerDied","Data":"cf2d657708fee526b5a48d06d71611b574133e8a5aedcf646d11d554b05c92c0"} Mar 19 09:52:21.244712 master-0 kubenswrapper[15202]: I0319 09:52:21.244685 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf2d657708fee526b5a48d06d71611b574133e8a5aedcf646d11d554b05c92c0" Mar 19 09:52:21.244988 master-0 kubenswrapper[15202]: I0319 09:52:21.244966 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jshcc" Mar 19 09:52:22.277209 master-0 kubenswrapper[15202]: I0319 09:52:22.277148 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" event={"ID":"6898ed5b-562b-415f-93f6-ddf0c1e01558","Type":"ContainerStarted","Data":"59eab6d4d9d8948434477e98def35f77419843608afebc02befd0c1d78c60024"} Mar 19 09:52:22.834700 master-0 kubenswrapper[15202]: I0319 09:52:22.834171 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5ec6-account-create-update-fn8fv"] Mar 19 09:52:23.290515 master-0 kubenswrapper[15202]: I0319 09:52:23.289395 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" event={"ID":"f2142cc2-3e56-4ff5-b467-b79d4a99c56c","Type":"ContainerStarted","Data":"3c38d79e2f1fc8e907a019093506e8a3a69aa6aa80d90fa80be9a09b4cf958c2"} Mar 19 09:52:23.290515 master-0 kubenswrapper[15202]: I0319 09:52:23.289460 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" event={"ID":"f2142cc2-3e56-4ff5-b467-b79d4a99c56c","Type":"ContainerStarted","Data":"a515d4af244d2de8faa985ce67365120c12ccb7587fdd426913a9b5d903402cd"} Mar 19 09:52:24.304461 master-0 kubenswrapper[15202]: I0319 09:52:24.304285 15202 generic.go:334] "Generic (PLEG): container finished" podID="fa213423-98fc-446d-9208-33d884780995" containerID="9f93692bac9e51d66d621ba05b2a00361d12e9e51fa9284ff9722c9c29dd9a2f" exitCode=0 Mar 19 09:52:24.304461 master-0 kubenswrapper[15202]: I0319 09:52:24.304360 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687479ff9d-8shw8" event={"ID":"fa213423-98fc-446d-9208-33d884780995","Type":"ContainerDied","Data":"9f93692bac9e51d66d621ba05b2a00361d12e9e51fa9284ff9722c9c29dd9a2f"} Mar 19 09:52:24.309326 master-0 kubenswrapper[15202]: I0319 09:52:24.309283 15202 generic.go:334] "Generic 
(PLEG): container finished" podID="6898ed5b-562b-415f-93f6-ddf0c1e01558" containerID="59eab6d4d9d8948434477e98def35f77419843608afebc02befd0c1d78c60024" exitCode=0 Mar 19 09:52:24.310016 master-0 kubenswrapper[15202]: I0319 09:52:24.309958 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" event={"ID":"6898ed5b-562b-415f-93f6-ddf0c1e01558","Type":"ContainerDied","Data":"59eab6d4d9d8948434477e98def35f77419843608afebc02befd0c1d78c60024"} Mar 19 09:52:24.585292 master-0 kubenswrapper[15202]: I0319 09:52:24.585156 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:52:24.646252 master-0 kubenswrapper[15202]: I0319 09:52:24.646068 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg7jk\" (UniqueName: \"kubernetes.io/projected/fa213423-98fc-446d-9208-33d884780995-kube-api-access-mg7jk\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.646252 master-0 kubenswrapper[15202]: I0319 09:52:24.646120 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-combined-ca-bundle\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.646252 master-0 kubenswrapper[15202]: I0319 09:52:24.646176 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-config-data\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.646572 master-0 kubenswrapper[15202]: I0319 09:52:24.646276 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fa213423-98fc-446d-9208-33d884780995-logs\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.646572 master-0 kubenswrapper[15202]: I0319 09:52:24.646385 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-public-tls-certs\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.646572 master-0 kubenswrapper[15202]: I0319 09:52:24.646508 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-scripts\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.646680 master-0 kubenswrapper[15202]: I0319 09:52:24.646652 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-internal-tls-certs\") pod \"fa213423-98fc-446d-9208-33d884780995\" (UID: \"fa213423-98fc-446d-9208-33d884780995\") " Mar 19 09:52:24.647514 master-0 kubenswrapper[15202]: I0319 09:52:24.646831 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa213423-98fc-446d-9208-33d884780995-logs" (OuterVolumeSpecName: "logs") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:24.647514 master-0 kubenswrapper[15202]: I0319 09:52:24.647265 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa213423-98fc-446d-9208-33d884780995-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:24.658249 master-0 kubenswrapper[15202]: I0319 09:52:24.658157 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-scripts" (OuterVolumeSpecName: "scripts") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24.661517 master-0 kubenswrapper[15202]: I0319 09:52:24.659891 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa213423-98fc-446d-9208-33d884780995-kube-api-access-mg7jk" (OuterVolumeSpecName: "kube-api-access-mg7jk") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). InnerVolumeSpecName "kube-api-access-mg7jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:24.706493 master-0 kubenswrapper[15202]: I0319 09:52:24.706382 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24.733632 master-0 kubenswrapper[15202]: I0319 09:52:24.733459 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-config-data" (OuterVolumeSpecName: "config-data") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24.750433 master-0 kubenswrapper[15202]: I0319 09:52:24.749163 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:24.750433 master-0 kubenswrapper[15202]: I0319 09:52:24.749219 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg7jk\" (UniqueName: \"kubernetes.io/projected/fa213423-98fc-446d-9208-33d884780995-kube-api-access-mg7jk\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:24.750433 master-0 kubenswrapper[15202]: I0319 09:52:24.749233 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:24.750433 master-0 kubenswrapper[15202]: I0319 09:52:24.749242 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:24.776685 master-0 kubenswrapper[15202]: I0319 09:52:24.776215 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24.791243 master-0 kubenswrapper[15202]: I0319 09:52:24.791149 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa213423-98fc-446d-9208-33d884780995" (UID: "fa213423-98fc-446d-9208-33d884780995"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24.852056 master-0 kubenswrapper[15202]: I0319 09:52:24.851968 15202 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:24.852056 master-0 kubenswrapper[15202]: I0319 09:52:24.852033 15202 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa213423-98fc-446d-9208-33d884780995-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:25.329514 master-0 kubenswrapper[15202]: I0319 09:52:25.328214 15202 generic.go:334] "Generic (PLEG): container finished" podID="f2142cc2-3e56-4ff5-b467-b79d4a99c56c" containerID="3c38d79e2f1fc8e907a019093506e8a3a69aa6aa80d90fa80be9a09b4cf958c2" exitCode=0 Mar 19 09:52:25.329514 master-0 kubenswrapper[15202]: I0319 09:52:25.328287 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" event={"ID":"f2142cc2-3e56-4ff5-b467-b79d4a99c56c","Type":"ContainerDied","Data":"3c38d79e2f1fc8e907a019093506e8a3a69aa6aa80d90fa80be9a09b4cf958c2"} Mar 19 09:52:25.331525 master-0 kubenswrapper[15202]: I0319 09:52:25.330765 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687479ff9d-8shw8" Mar 19 09:52:25.331796 master-0 kubenswrapper[15202]: I0319 09:52:25.331747 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687479ff9d-8shw8" event={"ID":"fa213423-98fc-446d-9208-33d884780995","Type":"ContainerDied","Data":"c856abfd7d7575ad1d1bd188cd9d13814b43bf73d84a00d443de688c4ef458ff"} Mar 19 09:52:25.331796 master-0 kubenswrapper[15202]: I0319 09:52:25.331774 15202 scope.go:117] "RemoveContainer" containerID="9f93692bac9e51d66d621ba05b2a00361d12e9e51fa9284ff9722c9c29dd9a2f" Mar 19 09:52:25.405505 master-0 kubenswrapper[15202]: I0319 09:52:25.395907 15202 scope.go:117] "RemoveContainer" containerID="037976b1a5e8e92d16755532488ffdbebd0e4c908e4d2426cb213e35e9515dcf" Mar 19 09:52:25.707496 master-0 kubenswrapper[15202]: I0319 09:52:25.707419 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:25.868679 master-0 kubenswrapper[15202]: I0319 09:52:25.868562 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" Mar 19 09:52:26.353867 master-0 kubenswrapper[15202]: I0319 09:52:26.353421 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" event={"ID":"6898ed5b-562b-415f-93f6-ddf0c1e01558","Type":"ContainerDied","Data":"321c20bf5b91cf80e7ec96782125637e36e7449ca88af9bdb397b139d68817ae"} Mar 19 09:52:26.353867 master-0 kubenswrapper[15202]: I0319 09:52:26.353520 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="321c20bf5b91cf80e7ec96782125637e36e7449ca88af9bdb397b139d68817ae" Mar 19 09:52:26.353867 master-0 kubenswrapper[15202]: I0319 09:52:26.353450 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" Mar 19 09:52:26.357672 master-0 kubenswrapper[15202]: I0319 09:52:26.357623 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"496c49f4-9bde-41e5-ab83-477abcf1c5ef","Type":"ContainerDied","Data":"8cd4a37f5217605fe5838493834b38280a37aebc751efd2af6bd248cd4427f3a"} Mar 19 09:52:26.357760 master-0 kubenswrapper[15202]: I0319 09:52:26.357686 15202 scope.go:117] "RemoveContainer" containerID="df5424396812996687c783a3b69bbe79ddcfc74205891ffc7ddd501a9b5f7d01" Mar 19 09:52:26.357842 master-0 kubenswrapper[15202]: I0319 09:52:26.357820 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.388399 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-scripts\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.388784 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6898ed5b-562b-415f-93f6-ddf0c1e01558-operator-scripts\") pod \"6898ed5b-562b-415f-93f6-ddf0c1e01558\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.388830 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-internal-tls-certs\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.388870 15202 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h86v5\" (UniqueName: \"kubernetes.io/projected/6898ed5b-562b-415f-93f6-ddf0c1e01558-kube-api-access-h86v5\") pod \"6898ed5b-562b-415f-93f6-ddf0c1e01558\" (UID: \"6898ed5b-562b-415f-93f6-ddf0c1e01558\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.388905 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-combined-ca-bundle\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.388934 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6crtb\" (UniqueName: \"kubernetes.io/projected/496c49f4-9bde-41e5-ab83-477abcf1c5ef-kube-api-access-6crtb\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.389122 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.389209 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-config-data\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.389273 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-httpd-run\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.389377 master-0 kubenswrapper[15202]: I0319 09:52:26.389302 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-logs\") pod \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\" (UID: \"496c49f4-9bde-41e5-ab83-477abcf1c5ef\") " Mar 19 09:52:26.392003 master-0 kubenswrapper[15202]: I0319 09:52:26.391953 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:26.392379 master-0 kubenswrapper[15202]: I0319 09:52:26.392346 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-logs" (OuterVolumeSpecName: "logs") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:26.392562 master-0 kubenswrapper[15202]: I0319 09:52:26.392494 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6898ed5b-562b-415f-93f6-ddf0c1e01558-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6898ed5b-562b-415f-93f6-ddf0c1e01558" (UID: "6898ed5b-562b-415f-93f6-ddf0c1e01558"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:26.393404 master-0 kubenswrapper[15202]: I0319 09:52:26.393325 15202 scope.go:117] "RemoveContainer" containerID="98854ebdc204abcd4cff2dac5ca4f7526aadfe02d73a0ff60ff32c560e6899db" Mar 19 09:52:26.396937 master-0 kubenswrapper[15202]: I0319 09:52:26.396896 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496c49f4-9bde-41e5-ab83-477abcf1c5ef-kube-api-access-6crtb" (OuterVolumeSpecName: "kube-api-access-6crtb") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "kube-api-access-6crtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:26.399805 master-0 kubenswrapper[15202]: I0319 09:52:26.399776 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6898ed5b-562b-415f-93f6-ddf0c1e01558-kube-api-access-h86v5" (OuterVolumeSpecName: "kube-api-access-h86v5") pod "6898ed5b-562b-415f-93f6-ddf0c1e01558" (UID: "6898ed5b-562b-415f-93f6-ddf0c1e01558"). InnerVolumeSpecName "kube-api-access-h86v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:26.400667 master-0 kubenswrapper[15202]: I0319 09:52:26.400619 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-scripts" (OuterVolumeSpecName: "scripts") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:26.426420 master-0 kubenswrapper[15202]: I0319 09:52:26.426355 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:26.430451 master-0 kubenswrapper[15202]: I0319 09:52:26.430394 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097" (OuterVolumeSpecName: "glance") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:52:26.484334 master-0 kubenswrapper[15202]: I0319 09:52:26.484202 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-config-data" (OuterVolumeSpecName: "config-data") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:26.487412 master-0 kubenswrapper[15202]: I0319 09:52:26.487295 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "496c49f4-9bde-41e5-ab83-477abcf1c5ef" (UID: "496c49f4-9bde-41e5-ab83-477abcf1c5ef"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493171 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6898ed5b-562b-415f-93f6-ddf0c1e01558-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493215 15202 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493228 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h86v5\" (UniqueName: \"kubernetes.io/projected/6898ed5b-562b-415f-93f6-ddf0c1e01558-kube-api-access-h86v5\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493240 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493251 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6crtb\" (UniqueName: \"kubernetes.io/projected/496c49f4-9bde-41e5-ab83-477abcf1c5ef-kube-api-access-6crtb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493294 15202 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") on node \"master-0\" " Mar 19 09:52:26.493296 master-0 kubenswrapper[15202]: I0319 09:52:26.493305 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.494002 master-0 kubenswrapper[15202]: I0319 09:52:26.493315 15202 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.494002 master-0 kubenswrapper[15202]: I0319 09:52:26.493325 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/496c49f4-9bde-41e5-ab83-477abcf1c5ef-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.494002 master-0 kubenswrapper[15202]: I0319 09:52:26.493334 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/496c49f4-9bde-41e5-ab83-477abcf1c5ef-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.549679 master-0 kubenswrapper[15202]: I0319 09:52:26.548955 15202 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:52:26.549679 master-0 kubenswrapper[15202]: I0319 09:52:26.549220 15202 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba" (UniqueName: "kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097") on node "master-0" Mar 19 09:52:26.595131 master-0 kubenswrapper[15202]: I0319 09:52:26.595033 15202 reconciler_common.go:293] "Volume detached for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.771375 master-0 kubenswrapper[15202]: I0319 09:52:26.771305 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:26.904615 master-0 kubenswrapper[15202]: I0319 09:52:26.904562 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zmq6\" (UniqueName: \"kubernetes.io/projected/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-kube-api-access-9zmq6\") pod \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " Mar 19 09:52:26.904864 master-0 kubenswrapper[15202]: I0319 09:52:26.904840 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-operator-scripts\") pod \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\" (UID: \"f2142cc2-3e56-4ff5-b467-b79d4a99c56c\") " Mar 19 09:52:26.905596 master-0 kubenswrapper[15202]: I0319 09:52:26.905531 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2142cc2-3e56-4ff5-b467-b79d4a99c56c" (UID: "f2142cc2-3e56-4ff5-b467-b79d4a99c56c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:26.906423 master-0 kubenswrapper[15202]: I0319 09:52:26.906388 15202 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:26.907985 master-0 kubenswrapper[15202]: I0319 09:52:26.907940 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-kube-api-access-9zmq6" (OuterVolumeSpecName: "kube-api-access-9zmq6") pod "f2142cc2-3e56-4ff5-b467-b79d4a99c56c" (UID: "f2142cc2-3e56-4ff5-b467-b79d4a99c56c"). 
InnerVolumeSpecName "kube-api-access-9zmq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:27.009178 master-0 kubenswrapper[15202]: I0319 09:52:27.009114 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zmq6\" (UniqueName: \"kubernetes.io/projected/f2142cc2-3e56-4ff5-b467-b79d4a99c56c-kube-api-access-9zmq6\") on node \"master-0\" DevicePath \"\"" Mar 19 09:52:27.091017 master-0 kubenswrapper[15202]: I0319 09:52:27.090949 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:52:27.260229 master-0 kubenswrapper[15202]: I0319 09:52:27.260075 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:52:27.381574 master-0 kubenswrapper[15202]: I0319 09:52:27.381488 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" event={"ID":"f2142cc2-3e56-4ff5-b467-b79d4a99c56c","Type":"ContainerDied","Data":"a515d4af244d2de8faa985ce67365120c12ccb7587fdd426913a9b5d903402cd"} Mar 19 09:52:27.381574 master-0 kubenswrapper[15202]: I0319 09:52:27.381544 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a515d4af244d2de8faa985ce67365120c12ccb7587fdd426913a9b5d903402cd" Mar 19 09:52:27.382159 master-0 kubenswrapper[15202]: I0319 09:52:27.381601 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5ec6-account-create-update-fn8fv" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.005663 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006163 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" containerName="mariadb-database-create" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006177 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" containerName="mariadb-database-create" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006197 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-log" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006205 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-log" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006224 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-log" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006232 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-log" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006252 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2142cc2-3e56-4ff5-b467-b79d4a99c56c" containerName="mariadb-account-create-update" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006260 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2142cc2-3e56-4ff5-b467-b79d4a99c56c" containerName="mariadb-account-create-update" 
Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006270 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6898ed5b-562b-415f-93f6-ddf0c1e01558" containerName="mariadb-account-create-update" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006278 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="6898ed5b-562b-415f-93f6-ddf0c1e01558" containerName="mariadb-account-create-update" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006308 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-api" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006314 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-api" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: E0319 09:52:28.006325 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-httpd" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006331 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-httpd" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006557 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-log" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006577 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-httpd" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006598 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" containerName="mariadb-database-create" Mar 19 09:52:28.008502 master-0 
kubenswrapper[15202]: I0319 09:52:28.006610 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2142cc2-3e56-4ff5-b467-b79d4a99c56c" containerName="mariadb-account-create-update" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006623 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" containerName="glance-log" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006639 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa213423-98fc-446d-9208-33d884780995" containerName="placement-api" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.006650 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="6898ed5b-562b-415f-93f6-ddf0c1e01558" containerName="mariadb-account-create-update" Mar 19 09:52:28.008502 master-0 kubenswrapper[15202]: I0319 09:52:28.007792 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.010120 master-0 kubenswrapper[15202]: I0319 09:52:28.010043 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 09:52:28.010427 master-0 kubenswrapper[15202]: I0319 09:52:28.010223 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-default-external-config-data" Mar 19 09:52:28.010427 master-0 kubenswrapper[15202]: I0319 09:52:28.010344 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 09:52:28.347229 master-0 kubenswrapper[15202]: I0319 09:52:28.347059 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-ea37-account-create-update-c8nf7" podStartSLOduration=14.347038255 podStartE2EDuration="14.347038255s" podCreationTimestamp="2026-03-19 09:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:28.312274498 +0000 UTC m=+1665.697689324" watchObservedRunningTime="2026-03-19 09:52:28.347038255 +0000 UTC m=+1665.732453071" Mar 19 09:52:28.350401 master-0 kubenswrapper[15202]: I0319 09:52:28.350342 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:52:28.359899 master-0 kubenswrapper[15202]: I0319 09:52:28.359835 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmzj\" (UniqueName: \"kubernetes.io/projected/c8e50d67-c919-4e31-a98d-882b87a58541-kube-api-access-8nmzj\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.360261 master-0 kubenswrapper[15202]: I0319 09:52:28.360243 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.360600 master-0 kubenswrapper[15202]: I0319 09:52:28.360583 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.361241 master-0 kubenswrapper[15202]: I0319 09:52:28.360704 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c8e50d67-c919-4e31-a98d-882b87a58541-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.361384 master-0 kubenswrapper[15202]: I0319 09:52:28.361367 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.361495 master-0 kubenswrapper[15202]: I0319 09:52:28.361461 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.361643 master-0 kubenswrapper[15202]: I0319 09:52:28.361627 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e50d67-c919-4e31-a98d-882b87a58541-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.361860 master-0 kubenswrapper[15202]: I0319 09:52:28.361781 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464156 master-0 kubenswrapper[15202]: 
I0319 09:52:28.464072 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmzj\" (UniqueName: \"kubernetes.io/projected/c8e50d67-c919-4e31-a98d-882b87a58541-kube-api-access-8nmzj\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464860 master-0 kubenswrapper[15202]: I0319 09:52:28.464157 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464860 master-0 kubenswrapper[15202]: I0319 09:52:28.464230 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8e50d67-c919-4e31-a98d-882b87a58541-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464860 master-0 kubenswrapper[15202]: I0319 09:52:28.464272 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464860 master-0 kubenswrapper[15202]: I0319 09:52:28.464299 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " 
pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464860 master-0 kubenswrapper[15202]: I0319 09:52:28.464333 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e50d67-c919-4e31-a98d-882b87a58541-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.464860 master-0 kubenswrapper[15202]: I0319 09:52:28.464377 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.465615 master-0 kubenswrapper[15202]: I0319 09:52:28.465550 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8e50d67-c919-4e31-a98d-882b87a58541-httpd-run\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.466030 master-0 kubenswrapper[15202]: I0319 09:52:28.465967 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e50d67-c919-4e31-a98d-882b87a58541-logs\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.469110 master-0 kubenswrapper[15202]: I0319 09:52:28.468076 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-public-tls-certs\") pod \"glance-3a5fd-default-external-api-0\" (UID: 
\"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.469110 master-0 kubenswrapper[15202]: I0319 09:52:28.468514 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-scripts\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.470976 master-0 kubenswrapper[15202]: I0319 09:52:28.470940 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-config-data\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.471185 master-0 kubenswrapper[15202]: I0319 09:52:28.471130 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e50d67-c919-4e31-a98d-882b87a58541-combined-ca-bundle\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.834854 master-0 kubenswrapper[15202]: I0319 09:52:28.834801 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmzj\" (UniqueName: \"kubernetes.io/projected/c8e50d67-c919-4e31-a98d-882b87a58541-kube-api-access-8nmzj\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.836153 master-0 kubenswrapper[15202]: I0319 09:52:28.836096 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a9e839-7eb3-4ba6-bc63-7220be59d238" path="/var/lib/kubelet/pods/20a9e839-7eb3-4ba6-bc63-7220be59d238/volumes" Mar 19 
09:52:28.874321 master-0 kubenswrapper[15202]: I0319 09:52:28.874239 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:28.876414 master-0 kubenswrapper[15202]: I0319 09:52:28.876332 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:52:28.876414 master-0 kubenswrapper[15202]: I0319 09:52:28.876374 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/761159666a2fdb4e1d7229cd039b70780d5ca1904241b607263d6bd54bcba60c/globalmount\"" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:30.133428 master-0 kubenswrapper[15202]: I0319 09:52:29.783808 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-430873fc-8a8f-4afc-91e0-5a0e7c55256f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^580dc7b0-8ed9-4c3c-b55f-8353e8cbc688\") pod \"glance-3a5fd-default-external-api-0\" (UID: \"c8e50d67-c919-4e31-a98d-882b87a58541\") " pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:30.427345 master-0 kubenswrapper[15202]: I0319 09:52:30.427180 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:30.714318 master-0 kubenswrapper[15202]: I0319 09:52:30.714253 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"] Mar 19 09:52:31.198106 master-0 kubenswrapper[15202]: I0319 09:52:31.198020 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"] Mar 19 09:52:32.268411 master-0 kubenswrapper[15202]: I0319 09:52:32.268322 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"] Mar 19 09:52:32.272630 master-0 kubenswrapper[15202]: I0319 09:52:32.272461 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:32.275935 master-0 kubenswrapper[15202]: I0319 09:52:32.275881 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3a5fd-default-internal-config-data" Mar 19 09:52:32.276496 master-0 kubenswrapper[15202]: I0319 09:52:32.276458 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 09:52:32.792166 master-0 kubenswrapper[15202]: I0319 09:52:32.792084 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"] Mar 19 09:52:32.830174 master-0 kubenswrapper[15202]: I0319 09:52:32.829345 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496c49f4-9bde-41e5-ab83-477abcf1c5ef" path="/var/lib/kubelet/pods/496c49f4-9bde-41e5-ab83-477abcf1c5ef/volumes" Mar 19 09:52:33.204096 master-0 kubenswrapper[15202]: I0319 09:52:33.203996 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7lw2\" (UniqueName: \"kubernetes.io/projected/fca47216-8f0d-4d96-b557-0f35c442eccb-kube-api-access-k7lw2\") pod 
\"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.204527 master-0 kubenswrapper[15202]: I0319 09:52:33.204116 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.204527 master-0 kubenswrapper[15202]: I0319 09:52:33.204203 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.204527 master-0 kubenswrapper[15202]: I0319 09:52:33.204276 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.204527 master-0 kubenswrapper[15202]: I0319 09:52:33.204310 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.204527 master-0 kubenswrapper[15202]: I0319 09:52:33.204404 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fca47216-8f0d-4d96-b557-0f35c442eccb-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.204527 master-0 kubenswrapper[15202]: I0319 09:52:33.204438 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.213509 master-0 kubenswrapper[15202]: I0319 09:52:33.212554 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca47216-8f0d-4d96-b557-0f35c442eccb-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.272945 master-0 kubenswrapper[15202]: I0319 09:52:33.272858 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687479ff9d-8shw8"] Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.315944 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7lw2\" (UniqueName: \"kubernetes.io/projected/fca47216-8f0d-4d96-b557-0f35c442eccb-kube-api-access-k7lw2\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.316064 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.316144 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.316201 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.316278 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fca47216-8f0d-4d96-b557-0f35c442eccb-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.316310 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.316492 master-0 kubenswrapper[15202]: I0319 09:52:33.316347 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca47216-8f0d-4d96-b557-0f35c442eccb-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.320485 master-0 kubenswrapper[15202]: I0319 09:52:33.317047 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fca47216-8f0d-4d96-b557-0f35c442eccb-logs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.321084 master-0 kubenswrapper[15202]: I0319 09:52:33.321043 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fca47216-8f0d-4d96-b557-0f35c442eccb-httpd-run\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.327487 master-0 kubenswrapper[15202]: I0319 09:52:33.323906 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-internal-tls-certs\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.327694 master-0 kubenswrapper[15202]: I0319 09:52:33.327594 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-combined-ca-bundle\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.331495 master-0 kubenswrapper[15202]: I0319 09:52:33.329319 15202 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-config-data\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.341489 master-0 kubenswrapper[15202]: I0319 09:52:33.338295 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fca47216-8f0d-4d96-b557-0f35c442eccb-scripts\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.704518 master-0 kubenswrapper[15202]: I0319 09:52:33.704406 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-687479ff9d-8shw8"] Mar 19 09:52:33.828296 master-0 kubenswrapper[15202]: I0319 09:52:33.828186 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:33.832933 master-0 kubenswrapper[15202]: I0319 09:52:33.832879 15202 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:52:33.833055 master-0 kubenswrapper[15202]: I0319 09:52:33.832964 15202 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4408f40dfb1603f9af45ac684ff95accfb01fbfdee7d66269d29c583d6626950/globalmount\"" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:34.125169 master-0 kubenswrapper[15202]: I0319 09:52:34.125057 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7lw2\" (UniqueName: \"kubernetes.io/projected/fca47216-8f0d-4d96-b557-0f35c442eccb-kube-api-access-k7lw2\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:34.831330 master-0 kubenswrapper[15202]: I0319 09:52:34.831248 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa213423-98fc-446d-9208-33d884780995" path="/var/lib/kubelet/pods/fa213423-98fc-446d-9208-33d884780995/volumes" Mar 19 09:52:34.993946 master-0 kubenswrapper[15202]: I0319 09:52:34.993875 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efcc7bfe-2396-4399-97dc-5dbf9ab97eba\" (UniqueName: \"kubernetes.io/csi/topolvm.io^77a30552-7aa1-499c-a568-1687eaffc097\") pod \"glance-3a5fd-default-internal-api-0\" (UID: \"fca47216-8f0d-4d96-b557-0f35c442eccb\") " pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:35.302928 master-0 kubenswrapper[15202]: I0319 09:52:35.302843 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:36.176275 master-0 kubenswrapper[15202]: I0319 09:52:36.176216 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-external-api-0"] Mar 19 09:52:36.508893 master-0 kubenswrapper[15202]: I0319 09:52:36.508830 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"c8e50d67-c919-4e31-a98d-882b87a58541","Type":"ContainerStarted","Data":"671eb07510ab1a505214c1de7716bf0caea48d206798c00d6004730d06ea7aa0"} Mar 19 09:52:38.667527 master-0 kubenswrapper[15202]: I0319 09:52:38.667441 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3a5fd-default-internal-api-0"] Mar 19 09:52:38.679823 master-0 kubenswrapper[15202]: W0319 09:52:38.679728 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfca47216_8f0d_4d96_b557_0f35c442eccb.slice/crio-19b2673e8667a5bd9b9d59d74e0d0764396480877368d724cc84c298a76a32af WatchSource:0}: Error finding container 19b2673e8667a5bd9b9d59d74e0d0764396480877368d724cc84c298a76a32af: Status 404 returned error can't find the container with id 19b2673e8667a5bd9b9d59d74e0d0764396480877368d724cc84c298a76a32af Mar 19 09:52:39.553631 master-0 kubenswrapper[15202]: I0319 09:52:39.552575 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"fca47216-8f0d-4d96-b557-0f35c442eccb","Type":"ContainerStarted","Data":"19b2673e8667a5bd9b9d59d74e0d0764396480877368d724cc84c298a76a32af"} Mar 19 09:52:40.234277 master-0 kubenswrapper[15202]: I0319 09:52:40.234200 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x9mns"] Mar 19 09:52:40.238246 master-0 kubenswrapper[15202]: I0319 09:52:40.236210 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.240371 master-0 kubenswrapper[15202]: I0319 09:52:40.240300 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 09:52:40.240676 master-0 kubenswrapper[15202]: I0319 09:52:40.240617 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 09:52:40.259093 master-0 kubenswrapper[15202]: I0319 09:52:40.259028 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x9mns"] Mar 19 09:52:40.327045 master-0 kubenswrapper[15202]: I0319 09:52:40.326971 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.327317 master-0 kubenswrapper[15202]: I0319 09:52:40.327173 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-scripts\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.327317 master-0 kubenswrapper[15202]: I0319 09:52:40.327226 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-config-data\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.327425 master-0 kubenswrapper[15202]: I0319 09:52:40.327325 
15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls84s\" (UniqueName: \"kubernetes.io/projected/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-kube-api-access-ls84s\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.431565 master-0 kubenswrapper[15202]: I0319 09:52:40.430628 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-scripts\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.431565 master-0 kubenswrapper[15202]: I0319 09:52:40.430700 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-config-data\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.432366 master-0 kubenswrapper[15202]: I0319 09:52:40.432322 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls84s\" (UniqueName: \"kubernetes.io/projected/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-kube-api-access-ls84s\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.432641 master-0 kubenswrapper[15202]: I0319 09:52:40.432608 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " 
pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.434800 master-0 kubenswrapper[15202]: I0319 09:52:40.434744 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-scripts\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.437406 master-0 kubenswrapper[15202]: I0319 09:52:40.437369 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.439767 master-0 kubenswrapper[15202]: I0319 09:52:40.439212 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-config-data\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.455064 master-0 kubenswrapper[15202]: I0319 09:52:40.453920 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls84s\" (UniqueName: \"kubernetes.io/projected/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-kube-api-access-ls84s\") pod \"nova-cell0-conductor-db-sync-x9mns\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") " pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:40.618957 master-0 kubenswrapper[15202]: I0319 09:52:40.618883 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x9mns" Mar 19 09:52:41.579204 master-0 kubenswrapper[15202]: I0319 09:52:41.579046 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"fca47216-8f0d-4d96-b557-0f35c442eccb","Type":"ContainerStarted","Data":"2f82c97fa45793fed8f5a243a629fbf679efc79bddaac4d5f5cccc6039c89ef7"} Mar 19 09:52:41.580986 master-0 kubenswrapper[15202]: I0319 09:52:41.580942 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"c8e50d67-c919-4e31-a98d-882b87a58541","Type":"ContainerStarted","Data":"1812aa543cde474bed9dc477950ed7e6f0322873b266c4123b84c57f37223e64"} Mar 19 09:52:41.666501 master-0 kubenswrapper[15202]: I0319 09:52:41.666226 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x9mns"] Mar 19 09:52:42.603429 master-0 kubenswrapper[15202]: I0319 09:52:42.603134 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-internal-api-0" event={"ID":"fca47216-8f0d-4d96-b557-0f35c442eccb","Type":"ContainerStarted","Data":"484b3b56cdc741a8322bc7bbc9ed27ff797ff357c2d174c9a2765bc3c8770bd5"} Mar 19 09:52:42.617237 master-0 kubenswrapper[15202]: I0319 09:52:42.609874 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x9mns" event={"ID":"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6","Type":"ContainerStarted","Data":"91eea55580ff9aec5f87309803efaa0ccd32c1e206152df4f373d069be6a8872"} Mar 19 09:52:42.644522 master-0 kubenswrapper[15202]: I0319 09:52:42.637626 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3a5fd-default-external-api-0" event={"ID":"c8e50d67-c919-4e31-a98d-882b87a58541","Type":"ContainerStarted","Data":"f78eb8e4c11adfbaa098fbe6f958be312830995293b675b1cde1f39e70b8bdc9"} Mar 19 09:52:42.651489 master-0 kubenswrapper[15202]: I0319 
09:52:42.646528 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3a5fd-default-internal-api-0" podStartSLOduration=11.646506069 podStartE2EDuration="11.646506069s" podCreationTimestamp="2026-03-19 09:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:42.634227635 +0000 UTC m=+1680.019642451" watchObservedRunningTime="2026-03-19 09:52:42.646506069 +0000 UTC m=+1680.031920885" Mar 19 09:52:42.703948 master-0 kubenswrapper[15202]: I0319 09:52:42.703859 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3a5fd-default-external-api-0" podStartSLOduration=15.703839589 podStartE2EDuration="15.703839589s" podCreationTimestamp="2026-03-19 09:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:42.676040761 +0000 UTC m=+1680.061455597" watchObservedRunningTime="2026-03-19 09:52:42.703839589 +0000 UTC m=+1680.089254405" Mar 19 09:52:45.309916 master-0 kubenswrapper[15202]: I0319 09:52:45.308676 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:45.309916 master-0 kubenswrapper[15202]: I0319 09:52:45.308733 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:45.351422 master-0 kubenswrapper[15202]: I0319 09:52:45.351367 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:45.361805 master-0 kubenswrapper[15202]: I0319 09:52:45.361678 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:45.680567 master-0 kubenswrapper[15202]: 
I0319 09:52:45.680490 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:45.680567 master-0 kubenswrapper[15202]: I0319 09:52:45.680542 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:50.428493 master-0 kubenswrapper[15202]: I0319 09:52:50.428403 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:50.428493 master-0 kubenswrapper[15202]: I0319 09:52:50.428481 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:50.432965 master-0 kubenswrapper[15202]: I0319 09:52:50.431641 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:50.436595 master-0 kubenswrapper[15202]: I0319 09:52:50.436544 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-internal-api-0" Mar 19 09:52:50.460395 master-0 kubenswrapper[15202]: I0319 09:52:50.460319 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:50.478544 master-0 kubenswrapper[15202]: I0319 09:52:50.478452 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:50.751512 master-0 kubenswrapper[15202]: I0319 09:52:50.751378 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x9mns" event={"ID":"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6","Type":"ContainerStarted","Data":"26b548981a81bd303ac44e68210e5e48a84f86c169d173e4646aa80b4f6775c4"} Mar 19 09:52:50.751512 master-0 kubenswrapper[15202]: I0319 09:52:50.751428 15202 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:50.752423 master-0 kubenswrapper[15202]: I0319 09:52:50.752323 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:51.777672 master-0 kubenswrapper[15202]: I0319 09:52:51.777570 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x9mns" podStartSLOduration=3.879170066 podStartE2EDuration="11.777549184s" podCreationTimestamp="2026-03-19 09:52:40 +0000 UTC" firstStartedPulling="2026-03-19 09:52:41.681287103 +0000 UTC m=+1679.066701919" lastFinishedPulling="2026-03-19 09:52:49.579666221 +0000 UTC m=+1686.965081037" observedRunningTime="2026-03-19 09:52:51.761517247 +0000 UTC m=+1689.146932083" watchObservedRunningTime="2026-03-19 09:52:51.777549184 +0000 UTC m=+1689.162964000" Mar 19 09:52:52.788118 master-0 kubenswrapper[15202]: I0319 09:52:52.788050 15202 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:52:52.788118 master-0 kubenswrapper[15202]: I0319 09:52:52.788105 15202 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:52:52.896251 master-0 kubenswrapper[15202]: I0319 09:52:52.896189 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:52:52.979086 master-0 kubenswrapper[15202]: I0319 09:52:52.979025 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3a5fd-default-external-api-0" Mar 19 09:53:06.987532 master-0 kubenswrapper[15202]: I0319 09:53:06.987421 15202 generic.go:334] "Generic (PLEG): container finished" podID="89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" containerID="26b548981a81bd303ac44e68210e5e48a84f86c169d173e4646aa80b4f6775c4" exitCode=0 Mar 19 09:53:06.987532 master-0 kubenswrapper[15202]: I0319 
09:53:06.987505 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x9mns" event={"ID":"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6","Type":"ContainerDied","Data":"26b548981a81bd303ac44e68210e5e48a84f86c169d173e4646aa80b4f6775c4"}
Mar 19 09:53:08.454403 master-0 kubenswrapper[15202]: I0319 09:53:08.454339 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x9mns"
Mar 19 09:53:08.608808 master-0 kubenswrapper[15202]: I0319 09:53:08.608631 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-combined-ca-bundle\") pod \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") "
Mar 19 09:53:08.608808 master-0 kubenswrapper[15202]: I0319 09:53:08.608721 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-scripts\") pod \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") "
Mar 19 09:53:08.609107 master-0 kubenswrapper[15202]: I0319 09:53:08.608843 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-config-data\") pod \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") "
Mar 19 09:53:08.609261 master-0 kubenswrapper[15202]: I0319 09:53:08.609202 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls84s\" (UniqueName: \"kubernetes.io/projected/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-kube-api-access-ls84s\") pod \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\" (UID: \"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6\") "
Mar 19 09:53:08.616680 master-0 kubenswrapper[15202]: I0319 09:53:08.616619 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-scripts" (OuterVolumeSpecName: "scripts") pod "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" (UID: "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:08.623968 master-0 kubenswrapper[15202]: I0319 09:53:08.623905 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-kube-api-access-ls84s" (OuterVolumeSpecName: "kube-api-access-ls84s") pod "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" (UID: "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6"). InnerVolumeSpecName "kube-api-access-ls84s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:53:08.649263 master-0 kubenswrapper[15202]: I0319 09:53:08.649065 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-config-data" (OuterVolumeSpecName: "config-data") pod "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" (UID: "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:08.660377 master-0 kubenswrapper[15202]: I0319 09:53:08.660310 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" (UID: "89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:08.713320 master-0 kubenswrapper[15202]: I0319 09:53:08.713246 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls84s\" (UniqueName: \"kubernetes.io/projected/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-kube-api-access-ls84s\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:08.713320 master-0 kubenswrapper[15202]: I0319 09:53:08.713317 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:08.713584 master-0 kubenswrapper[15202]: I0319 09:53:08.713340 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:08.713584 master-0 kubenswrapper[15202]: I0319 09:53:08.713357 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:09.031521 master-0 kubenswrapper[15202]: I0319 09:53:09.031435 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x9mns" event={"ID":"89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6","Type":"ContainerDied","Data":"91eea55580ff9aec5f87309803efaa0ccd32c1e206152df4f373d069be6a8872"}
Mar 19 09:53:09.031521 master-0 kubenswrapper[15202]: I0319 09:53:09.031513 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91eea55580ff9aec5f87309803efaa0ccd32c1e206152df4f373d069be6a8872"
Mar 19 09:53:09.031866 master-0 kubenswrapper[15202]: I0319 09:53:09.031541 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x9mns"
Mar 19 09:53:09.193059 master-0 kubenswrapper[15202]: I0319 09:53:09.192704 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 09:53:09.193731 master-0 kubenswrapper[15202]: E0319 09:53:09.193601 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" containerName="nova-cell0-conductor-db-sync"
Mar 19 09:53:09.193731 master-0 kubenswrapper[15202]: I0319 09:53:09.193635 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" containerName="nova-cell0-conductor-db-sync"
Mar 19 09:53:09.194672 master-0 kubenswrapper[15202]: I0319 09:53:09.194001 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" containerName="nova-cell0-conductor-db-sync"
Mar 19 09:53:09.195051 master-0 kubenswrapper[15202]: I0319 09:53:09.194994 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.200989 master-0 kubenswrapper[15202]: I0319 09:53:09.200878 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 19 09:53:09.218389 master-0 kubenswrapper[15202]: I0319 09:53:09.217716 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 09:53:09.226605 master-0 kubenswrapper[15202]: I0319 09:53:09.226568 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f6df2-3122-47bf-839d-bc6b737aa320-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.226704 master-0 kubenswrapper[15202]: I0319 09:53:09.226649 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f6df2-3122-47bf-839d-bc6b737aa320-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.227022 master-0 kubenswrapper[15202]: I0319 09:53:09.226897 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbt94\" (UniqueName: \"kubernetes.io/projected/801f6df2-3122-47bf-839d-bc6b737aa320-kube-api-access-bbt94\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.328538 master-0 kubenswrapper[15202]: I0319 09:53:09.328389 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f6df2-3122-47bf-839d-bc6b737aa320-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.328538 master-0 kubenswrapper[15202]: I0319 09:53:09.328499 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f6df2-3122-47bf-839d-bc6b737aa320-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.328799 master-0 kubenswrapper[15202]: I0319 09:53:09.328582 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbt94\" (UniqueName: \"kubernetes.io/projected/801f6df2-3122-47bf-839d-bc6b737aa320-kube-api-access-bbt94\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.334551 master-0 kubenswrapper[15202]: I0319 09:53:09.333723 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f6df2-3122-47bf-839d-bc6b737aa320-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.347005 master-0 kubenswrapper[15202]: I0319 09:53:09.346954 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbt94\" (UniqueName: \"kubernetes.io/projected/801f6df2-3122-47bf-839d-bc6b737aa320-kube-api-access-bbt94\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.349674 master-0 kubenswrapper[15202]: I0319 09:53:09.349634 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f6df2-3122-47bf-839d-bc6b737aa320-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"801f6df2-3122-47bf-839d-bc6b737aa320\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:09.531119 master-0 kubenswrapper[15202]: I0319 09:53:09.531009 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:10.005896 master-0 kubenswrapper[15202]: W0319 09:53:10.005684 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801f6df2_3122_47bf_839d_bc6b737aa320.slice/crio-509c3afc10a790a3558995d040e9f23dd0660c82351c6a9a108936ae3166c913 WatchSource:0}: Error finding container 509c3afc10a790a3558995d040e9f23dd0660c82351c6a9a108936ae3166c913: Status 404 returned error can't find the container with id 509c3afc10a790a3558995d040e9f23dd0660c82351c6a9a108936ae3166c913
Mar 19 09:53:10.006446 master-0 kubenswrapper[15202]: I0319 09:53:10.006407 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 09:53:10.044315 master-0 kubenswrapper[15202]: I0319 09:53:10.044248 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"801f6df2-3122-47bf-839d-bc6b737aa320","Type":"ContainerStarted","Data":"509c3afc10a790a3558995d040e9f23dd0660c82351c6a9a108936ae3166c913"}
Mar 19 09:53:11.058771 master-0 kubenswrapper[15202]: I0319 09:53:11.058712 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"801f6df2-3122-47bf-839d-bc6b737aa320","Type":"ContainerStarted","Data":"c8bbae1d8cad8f3badf26c44b6b1cf877af5c320eed2dc0aa35ea1c36d78e4c8"}
Mar 19 09:53:11.059757 master-0 kubenswrapper[15202]: I0319 09:53:11.059739 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:11.107212 master-0 kubenswrapper[15202]: I0319 09:53:11.107111 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.107089361 podStartE2EDuration="2.107089361s" podCreationTimestamp="2026-03-19 09:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:11.098929569 +0000 UTC m=+1708.484344395" watchObservedRunningTime="2026-03-19 09:53:11.107089361 +0000 UTC m=+1708.492504177"
Mar 19 09:53:19.562631 master-0 kubenswrapper[15202]: I0319 09:53:19.562578 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 19 09:53:20.251617 master-0 kubenswrapper[15202]: I0319 09:53:20.251498 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-t8sfd"]
Mar 19 09:53:20.254634 master-0 kubenswrapper[15202]: I0319 09:53:20.254567 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.264680 master-0 kubenswrapper[15202]: I0319 09:53:20.259009 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 19 09:53:20.264680 master-0 kubenswrapper[15202]: I0319 09:53:20.259281 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 19 09:53:20.289868 master-0 kubenswrapper[15202]: I0319 09:53:20.289766 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-config-data\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.290279 master-0 kubenswrapper[15202]: I0319 09:53:20.290255 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.290410 master-0 kubenswrapper[15202]: I0319 09:53:20.290392 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvqkn\" (UniqueName: \"kubernetes.io/projected/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-kube-api-access-dvqkn\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.290570 master-0 kubenswrapper[15202]: I0319 09:53:20.290555 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-scripts\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.326289 master-0 kubenswrapper[15202]: I0319 09:53:20.323817 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-t8sfd"]
Mar 19 09:53:20.394118 master-0 kubenswrapper[15202]: I0319 09:53:20.394061 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-scripts\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.394362 master-0 kubenswrapper[15202]: I0319 09:53:20.394134 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-config-data\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.394362 master-0 kubenswrapper[15202]: I0319 09:53:20.394263 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.394362 master-0 kubenswrapper[15202]: I0319 09:53:20.394312 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvqkn\" (UniqueName: \"kubernetes.io/projected/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-kube-api-access-dvqkn\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.399118 master-0 kubenswrapper[15202]: I0319 09:53:20.399081 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.400134 master-0 kubenswrapper[15202]: I0319 09:53:20.400100 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-scripts\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.401384 master-0 kubenswrapper[15202]: I0319 09:53:20.401340 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-config-data\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.448724 master-0 kubenswrapper[15202]: I0319 09:53:20.448679 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvqkn\" (UniqueName: \"kubernetes.io/projected/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-kube-api-access-dvqkn\") pod \"nova-cell0-cell-mapping-t8sfd\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") " pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.578853 master-0 kubenswrapper[15202]: I0319 09:53:20.578556 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:20.609601 master-0 kubenswrapper[15202]: I0319 09:53:20.609537 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:20.611934 master-0 kubenswrapper[15202]: I0319 09:53:20.611894 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 19 09:53:20.615918 master-0 kubenswrapper[15202]: I0319 09:53:20.614761 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 19 09:53:20.734067 master-0 kubenswrapper[15202]: I0319 09:53:20.733979 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-logs\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.734406 master-0 kubenswrapper[15202]: I0319 09:53:20.734109 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkxw\" (UniqueName: \"kubernetes.io/projected/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-kube-api-access-skkxw\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.734406 master-0 kubenswrapper[15202]: I0319 09:53:20.734161 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-config-data\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.734406 master-0 kubenswrapper[15202]: I0319 09:53:20.734185 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.809568 master-0 kubenswrapper[15202]: I0319 09:53:20.809373 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:20.850491 master-0 kubenswrapper[15202]: I0319 09:53:20.847997 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-logs\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.850491 master-0 kubenswrapper[15202]: I0319 09:53:20.848105 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkxw\" (UniqueName: \"kubernetes.io/projected/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-kube-api-access-skkxw\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.850491 master-0 kubenswrapper[15202]: I0319 09:53:20.848146 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-config-data\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.850491 master-0 kubenswrapper[15202]: I0319 09:53:20.848165 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.866890 master-0 kubenswrapper[15202]: I0319 09:53:20.861414 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-logs\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.880498 master-0 kubenswrapper[15202]: I0319 09:53:20.872698 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.908003 master-0 kubenswrapper[15202]: I0319 09:53:20.884657 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-config-data\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.909428 master-0 kubenswrapper[15202]: I0319 09:53:20.909377 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 09:53:20.925657 master-0 kubenswrapper[15202]: I0319 09:53:20.912220 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:20.934503 master-0 kubenswrapper[15202]: I0319 09:53:20.932956 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 19 09:53:20.934503 master-0 kubenswrapper[15202]: I0319 09:53:20.934259 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkxw\" (UniqueName: \"kubernetes.io/projected/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-kube-api-access-skkxw\") pod \"nova-api-0\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " pod="openstack/nova-api-0"
Mar 19 09:53:20.950726 master-0 kubenswrapper[15202]: I0319 09:53:20.949813 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:20.950726 master-0 kubenswrapper[15202]: I0319 09:53:20.949977 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:20.950726 master-0 kubenswrapper[15202]: I0319 09:53:20.950282 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgpz\" (UniqueName: \"kubernetes.io/projected/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-kube-api-access-jhgpz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.175552 master-0 kubenswrapper[15202]: I0319 09:53:21.138662 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 19 09:53:21.183815 master-0 kubenswrapper[15202]: I0319 09:53:21.182966 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgpz\" (UniqueName: \"kubernetes.io/projected/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-kube-api-access-jhgpz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.183815 master-0 kubenswrapper[15202]: I0319 09:53:21.183769 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.189252 master-0 kubenswrapper[15202]: I0319 09:53:21.188580 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.190337 master-0 kubenswrapper[15202]: I0319 09:53:21.190284 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.193169 master-0 kubenswrapper[15202]: I0319 09:53:21.193144 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.202657 master-0 kubenswrapper[15202]: I0319 09:53:21.202540 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 09:53:21.229489 master-0 kubenswrapper[15202]: I0319 09:53:21.223712 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgpz\" (UniqueName: \"kubernetes.io/projected/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-kube-api-access-jhgpz\") pod \"nova-cell1-novncproxy-0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.273555 master-0 kubenswrapper[15202]: I0319 09:53:21.263249 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:21.273555 master-0 kubenswrapper[15202]: I0319 09:53:21.270669 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:53:21.281158 master-0 kubenswrapper[15202]: I0319 09:53:21.278006 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 19 09:53:21.296503 master-0 kubenswrapper[15202]: I0319 09:53:21.290603 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e1aa29a-415f-42b9-83c5-271c70978d3a-logs\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.296503 master-0 kubenswrapper[15202]: I0319 09:53:21.290662 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.296503 master-0 kubenswrapper[15202]: I0319 09:53:21.290788 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cslc\" (UniqueName: \"kubernetes.io/projected/1e1aa29a-415f-42b9-83c5-271c70978d3a-kube-api-access-7cslc\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.296503 master-0 kubenswrapper[15202]: I0319 09:53:21.290826 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-config-data\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.296503 master-0 kubenswrapper[15202]: I0319 09:53:21.293147 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 09:53:21.307492 master-0 kubenswrapper[15202]: I0319 09:53:21.298044 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 19 09:53:21.307492 master-0 kubenswrapper[15202]: I0319 09:53:21.302216 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 19 09:53:21.318515 master-0 kubenswrapper[15202]: I0319 09:53:21.313509 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 09:53:21.344747 master-0 kubenswrapper[15202]: I0319 09:53:21.344690 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:21.368788 master-0 kubenswrapper[15202]: W0319 09:53:21.365880 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e9fe9a_8318_49fe_a5c6_b01a9737d2e3.slice/crio-d111f7dec4a17dc66d70dcd7258d61b9556a198dcb83f844fc1b9d9b4fb9bc5c WatchSource:0}: Error finding container d111f7dec4a17dc66d70dcd7258d61b9556a198dcb83f844fc1b9d9b4fb9bc5c: Status 404 returned error can't find the container with id d111f7dec4a17dc66d70dcd7258d61b9556a198dcb83f844fc1b9d9b4fb9bc5c
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.392208 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4cc6f549-55sdk"]
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.394066 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e1aa29a-415f-42b9-83c5-271c70978d3a-logs\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.394434 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e1aa29a-415f-42b9-83c5-271c70978d3a-logs\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.397282 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.397612 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-config-data\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.397762 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cslc\" (UniqueName: \"kubernetes.io/projected/1e1aa29a-415f-42b9-83c5-271c70978d3a-kube-api-access-7cslc\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.397812 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.397847 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nttmh\" (UniqueName: \"kubernetes.io/projected/d493f151-b19b-4399-a5cc-cf611fc5e727-kube-api-access-nttmh\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.397877 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-config-data\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.399486 master-0 kubenswrapper[15202]: I0319 09:53:21.398245 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.403857 master-0 kubenswrapper[15202]: I0319 09:53:21.403773 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-config-data\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.424488 master-0 kubenswrapper[15202]: I0319 09:53:21.418671 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cslc\" (UniqueName: \"kubernetes.io/projected/1e1aa29a-415f-42b9-83c5-271c70978d3a-kube-api-access-7cslc\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.437840 master-0 kubenswrapper[15202]: I0319 09:53:21.436143 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4cc6f549-55sdk"]
Mar 19 09:53:21.437840 master-0 kubenswrapper[15202]: I0319 09:53:21.437355 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:21.452240 master-0 kubenswrapper[15202]: I0319 09:53:21.446817 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:21.498384 master-0 kubenswrapper[15202]: I0319 09:53:21.498332 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-t8sfd"]
Mar 19 09:53:21.503098 master-0 kubenswrapper[15202]: I0319 09:53:21.503071 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-sb\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.507183 master-0 kubenswrapper[15202]: I0319 09:53:21.507156 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nttmh\" (UniqueName: \"kubernetes.io/projected/d493f151-b19b-4399-a5cc-cf611fc5e727-kube-api-access-nttmh\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:21.507316 master-0 kubenswrapper[15202]: I0319 09:53:21.507303 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-b\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.507569 master-0 kubenswrapper[15202]: I0319 09:53:21.507550 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-svc\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.508573 master-0 kubenswrapper[15202]: I0319 09:53:21.508552 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-config\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.508797 master-0 kubenswrapper[15202]: I0319 09:53:21.508781 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-config-data\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:21.510384 master-0 kubenswrapper[15202]: I0319 09:53:21.509819 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-a\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.510384 master-0 kubenswrapper[15202]: I0319 09:53:21.509891 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2562b\" (UniqueName: \"kubernetes.io/projected/3a4ea6bb-8177-449b-a022-fb62033cd8c9-kube-api-access-2562b\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.510384 master-0 kubenswrapper[15202]: I0319 09:53:21.509946 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-nb\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:21.510384 master-0 kubenswrapper[15202]: I0319 09:53:21.510057
15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0" Mar 19 09:53:21.510384 master-0 kubenswrapper[15202]: I0319 09:53:21.510098 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-swift-storage-0\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.516236 master-0 kubenswrapper[15202]: I0319 09:53:21.516190 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0" Mar 19 09:53:21.544667 master-0 kubenswrapper[15202]: I0319 09:53:21.544157 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-config-data\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0" Mar 19 09:53:21.585496 master-0 kubenswrapper[15202]: I0319 09:53:21.577212 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nttmh\" (UniqueName: \"kubernetes.io/projected/d493f151-b19b-4399-a5cc-cf611fc5e727-kube-api-access-nttmh\") pod \"nova-scheduler-0\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " pod="openstack/nova-scheduler-0" Mar 19 09:53:21.611491 master-0 kubenswrapper[15202]: I0319 09:53:21.611111 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621315 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-nb\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621388 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-swift-storage-0\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621413 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-sb\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621433 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-b\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621493 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-svc\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " 
pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621588 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-config\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621672 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-a\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.621698 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2562b\" (UniqueName: \"kubernetes.io/projected/3a4ea6bb-8177-449b-a022-fb62033cd8c9-kube-api-access-2562b\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.624060 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-b\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.624665 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-swift-storage-0\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " 
pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.625207 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-sb\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.640261 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-config\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.642083 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-svc\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.642692 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-a\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.656019 master-0 kubenswrapper[15202]: I0319 09:53:21.643334 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-nb\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.663188 master-0 
kubenswrapper[15202]: I0319 09:53:21.663135 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:53:21.677653 master-0 kubenswrapper[15202]: I0319 09:53:21.671011 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2562b\" (UniqueName: \"kubernetes.io/projected/3a4ea6bb-8177-449b-a022-fb62033cd8c9-kube-api-access-2562b\") pod \"dnsmasq-dns-b4cc6f549-55sdk\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.791499 master-0 kubenswrapper[15202]: I0319 09:53:21.776907 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:21.884307 master-0 kubenswrapper[15202]: I0319 09:53:21.872631 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:53:22.175304 master-0 kubenswrapper[15202]: I0319 09:53:22.175049 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s6rxr"] Mar 19 09:53:22.178290 master-0 kubenswrapper[15202]: I0319 09:53:22.178231 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.207920 master-0 kubenswrapper[15202]: I0319 09:53:22.181093 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 09:53:22.207920 master-0 kubenswrapper[15202]: I0319 09:53:22.183625 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 09:53:22.236551 master-0 kubenswrapper[15202]: I0319 09:53:22.236339 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s6rxr"] Mar 19 09:53:22.261406 master-0 kubenswrapper[15202]: I0319 09:53:22.261296 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:53:22.326591 master-0 kubenswrapper[15202]: I0319 09:53:22.326514 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-config-data\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.327494 master-0 kubenswrapper[15202]: I0319 09:53:22.327448 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-scripts\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.327742 master-0 kubenswrapper[15202]: I0319 09:53:22.327705 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzzt\" (UniqueName: \"kubernetes.io/projected/499db210-7cab-4a33-99b1-3be10260b2c2-kube-api-access-2fzzt\") pod \"nova-cell1-conductor-db-sync-s6rxr\" 
(UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.328302 master-0 kubenswrapper[15202]: I0319 09:53:22.328282 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.330012 master-0 kubenswrapper[15202]: I0319 09:53:22.329883 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t8sfd" event={"ID":"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3","Type":"ContainerStarted","Data":"7a3db32261122d75f83eb172b197008388cf30d3d9bff2ad096104ff269a2851"} Mar 19 09:53:22.330012 master-0 kubenswrapper[15202]: I0319 09:53:22.329935 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t8sfd" event={"ID":"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3","Type":"ContainerStarted","Data":"d111f7dec4a17dc66d70dcd7258d61b9556a198dcb83f844fc1b9d9b4fb9bc5c"} Mar 19 09:53:22.333690 master-0 kubenswrapper[15202]: I0319 09:53:22.333549 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"82f5e985-3f6a-4c95-a8b1-b107ff60cf25","Type":"ContainerStarted","Data":"b0a0b3a4a99cb97944a7aba2e47d4614990e52ae8399aeda105ddf3f668e38e9"} Mar 19 09:53:22.336553 master-0 kubenswrapper[15202]: I0319 09:53:22.336510 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0","Type":"ContainerStarted","Data":"2a476d0642f428b76d69459d9717142cafe7dfddfb486340d2baf04bf1d2d260"} Mar 19 09:53:22.369688 master-0 kubenswrapper[15202]: I0319 09:53:22.369600 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-t8sfd" podStartSLOduration=2.369572822 podStartE2EDuration="2.369572822s" podCreationTimestamp="2026-03-19 09:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:22.353878754 +0000 UTC m=+1719.739293570" watchObservedRunningTime="2026-03-19 09:53:22.369572822 +0000 UTC m=+1719.754987648" Mar 19 09:53:22.431834 master-0 kubenswrapper[15202]: I0319 09:53:22.431576 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-config-data\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.432091 master-0 kubenswrapper[15202]: I0319 09:53:22.431846 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-scripts\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.432091 master-0 kubenswrapper[15202]: I0319 09:53:22.431897 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fzzt\" (UniqueName: \"kubernetes.io/projected/499db210-7cab-4a33-99b1-3be10260b2c2-kube-api-access-2fzzt\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.437654 master-0 kubenswrapper[15202]: I0319 09:53:22.435718 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: 
\"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.445087 master-0 kubenswrapper[15202]: I0319 09:53:22.445038 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.445482 master-0 kubenswrapper[15202]: I0319 09:53:22.445389 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-scripts\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.446811 master-0 kubenswrapper[15202]: I0319 09:53:22.446782 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-config-data\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.460048 master-0 kubenswrapper[15202]: I0319 09:53:22.459998 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzzt\" (UniqueName: \"kubernetes.io/projected/499db210-7cab-4a33-99b1-3be10260b2c2-kube-api-access-2fzzt\") pod \"nova-cell1-conductor-db-sync-s6rxr\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.552190 master-0 kubenswrapper[15202]: W0319 09:53:22.552140 15202 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e1aa29a_415f_42b9_83c5_271c70978d3a.slice/crio-95c3d6cfff16080dfe01ab491118ac6610a912788a06f9e38dfdf0b6e3a38aca WatchSource:0}: Error finding container 95c3d6cfff16080dfe01ab491118ac6610a912788a06f9e38dfdf0b6e3a38aca: Status 404 returned error can't find the container with id 95c3d6cfff16080dfe01ab491118ac6610a912788a06f9e38dfdf0b6e3a38aca Mar 19 09:53:22.552190 master-0 kubenswrapper[15202]: I0319 09:53:22.552171 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:22.566363 master-0 kubenswrapper[15202]: I0319 09:53:22.566306 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:22.694200 master-0 kubenswrapper[15202]: I0319 09:53:22.687313 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4cc6f549-55sdk"] Mar 19 09:53:22.708482 master-0 kubenswrapper[15202]: I0319 09:53:22.703085 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:53:22.719963 master-0 kubenswrapper[15202]: W0319 09:53:22.719817 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a4ea6bb_8177_449b_a022_fb62033cd8c9.slice/crio-90e2789902082dd41b77a267a99c3537ad92d1ad7cfa99ee0a1e1a4058b055d5 WatchSource:0}: Error finding container 90e2789902082dd41b77a267a99c3537ad92d1ad7cfa99ee0a1e1a4058b055d5: Status 404 returned error can't find the container with id 90e2789902082dd41b77a267a99c3537ad92d1ad7cfa99ee0a1e1a4058b055d5 Mar 19 09:53:23.345966 master-0 kubenswrapper[15202]: I0319 09:53:23.345893 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s6rxr"] Mar 19 09:53:23.381692 master-0 kubenswrapper[15202]: I0319 09:53:23.381636 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"d493f151-b19b-4399-a5cc-cf611fc5e727","Type":"ContainerStarted","Data":"6006a377cd6492eeb20723837f3374bc15d1af5b9cf959d152eecc4f3493b8ca"} Mar 19 09:53:23.392458 master-0 kubenswrapper[15202]: I0319 09:53:23.392402 15202 generic.go:334] "Generic (PLEG): container finished" podID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerID="8d26556787d3f6581b20b2c6d142273913ed9a54cf40777a695cffab286be379" exitCode=0 Mar 19 09:53:23.392775 master-0 kubenswrapper[15202]: I0319 09:53:23.392520 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" event={"ID":"3a4ea6bb-8177-449b-a022-fb62033cd8c9","Type":"ContainerDied","Data":"8d26556787d3f6581b20b2c6d142273913ed9a54cf40777a695cffab286be379"} Mar 19 09:53:23.392775 master-0 kubenswrapper[15202]: I0319 09:53:23.392599 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" event={"ID":"3a4ea6bb-8177-449b-a022-fb62033cd8c9","Type":"ContainerStarted","Data":"90e2789902082dd41b77a267a99c3537ad92d1ad7cfa99ee0a1e1a4058b055d5"} Mar 19 09:53:23.402457 master-0 kubenswrapper[15202]: I0319 09:53:23.402357 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e1aa29a-415f-42b9-83c5-271c70978d3a","Type":"ContainerStarted","Data":"95c3d6cfff16080dfe01ab491118ac6610a912788a06f9e38dfdf0b6e3a38aca"} Mar 19 09:53:24.427319 master-0 kubenswrapper[15202]: I0319 09:53:24.427262 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" event={"ID":"499db210-7cab-4a33-99b1-3be10260b2c2","Type":"ContainerStarted","Data":"ba412efe7e3ca4d9971f84136722c6affc1bff2c8e4b512852b7658cf73ec6d3"} Mar 19 09:53:25.390495 master-0 kubenswrapper[15202]: I0319 09:53:25.390382 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:25.409674 master-0 kubenswrapper[15202]: I0319 09:53:25.409515 
15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:53:25.442352 master-0 kubenswrapper[15202]: I0319 09:53:25.442282 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" event={"ID":"499db210-7cab-4a33-99b1-3be10260b2c2","Type":"ContainerStarted","Data":"e6bc5b6305aa59c429b80b85e16d99b583ea566e1a6003529d789d35901aa80e"} Mar 19 09:53:25.500502 master-0 kubenswrapper[15202]: I0319 09:53:25.498763 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" podStartSLOduration=3.498738691 podStartE2EDuration="3.498738691s" podCreationTimestamp="2026-03-19 09:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:25.493228644 +0000 UTC m=+1722.878643460" watchObservedRunningTime="2026-03-19 09:53:25.498738691 +0000 UTC m=+1722.884153507" Mar 19 09:53:26.471493 master-0 kubenswrapper[15202]: I0319 09:53:26.469682 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0","Type":"ContainerStarted","Data":"bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21"} Mar 19 09:53:26.471493 master-0 kubenswrapper[15202]: I0319 09:53:26.469872 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21" gracePeriod=30 Mar 19 09:53:26.486502 master-0 kubenswrapper[15202]: I0319 09:53:26.484860 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:53:26.490718 master-0 kubenswrapper[15202]: I0319 
09:53:26.488182 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e1aa29a-415f-42b9-83c5-271c70978d3a","Type":"ContainerStarted","Data":"1e73e9e06ce4dbfdeefe190b09f4137eadfe2f1bd7c7509d84d03fcaa981f732"} Mar 19 09:53:26.510494 master-0 kubenswrapper[15202]: I0319 09:53:26.510317 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8957479189999997 podStartE2EDuration="6.510299434s" podCreationTimestamp="2026-03-19 09:53:20 +0000 UTC" firstStartedPulling="2026-03-19 09:53:22.263382473 +0000 UTC m=+1719.648797289" lastFinishedPulling="2026-03-19 09:53:25.877933988 +0000 UTC m=+1723.263348804" observedRunningTime="2026-03-19 09:53:26.493809896 +0000 UTC m=+1723.879224712" watchObservedRunningTime="2026-03-19 09:53:26.510299434 +0000 UTC m=+1723.895714250" Mar 19 09:53:26.569263 master-0 kubenswrapper[15202]: I0319 09:53:26.569162 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" podStartSLOduration=5.569141921 podStartE2EDuration="5.569141921s" podCreationTimestamp="2026-03-19 09:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:26.557560753 +0000 UTC m=+1723.942975569" watchObservedRunningTime="2026-03-19 09:53:26.569141921 +0000 UTC m=+1723.954556737" Mar 19 09:53:27.510072 master-0 kubenswrapper[15202]: I0319 09:53:27.509582 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d493f151-b19b-4399-a5cc-cf611fc5e727","Type":"ContainerStarted","Data":"58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de"} Mar 19 09:53:27.516505 master-0 kubenswrapper[15202]: I0319 09:53:27.515990 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" 
event={"ID":"3a4ea6bb-8177-449b-a022-fb62033cd8c9","Type":"ContainerStarted","Data":"31fb14aabb59297cd0e36e36253e342817d1d743c51dbbec04577697acb1dfb0"} Mar 19 09:53:27.519494 master-0 kubenswrapper[15202]: I0319 09:53:27.519208 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e1aa29a-415f-42b9-83c5-271c70978d3a","Type":"ContainerStarted","Data":"2e5aa9976888f5980e1ce97a4c5ac9c90d3a439fd23fb350e05f3698b726b587"} Mar 19 09:53:27.519494 master-0 kubenswrapper[15202]: I0319 09:53:27.519347 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-log" containerID="cri-o://1e73e9e06ce4dbfdeefe190b09f4137eadfe2f1bd7c7509d84d03fcaa981f732" gracePeriod=30 Mar 19 09:53:27.520228 master-0 kubenswrapper[15202]: I0319 09:53:27.519920 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-metadata" containerID="cri-o://2e5aa9976888f5980e1ce97a4c5ac9c90d3a439fd23fb350e05f3698b726b587" gracePeriod=30 Mar 19 09:53:27.524237 master-0 kubenswrapper[15202]: I0319 09:53:27.523960 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"82f5e985-3f6a-4c95-a8b1-b107ff60cf25","Type":"ContainerStarted","Data":"0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87"} Mar 19 09:53:27.524237 master-0 kubenswrapper[15202]: I0319 09:53:27.524002 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"82f5e985-3f6a-4c95-a8b1-b107ff60cf25","Type":"ContainerStarted","Data":"53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449"} Mar 19 09:53:27.679165 master-0 kubenswrapper[15202]: I0319 09:53:27.679028 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=4.509120111 podStartE2EDuration="7.679006517s" podCreationTimestamp="2026-03-19 09:53:20 +0000 UTC" firstStartedPulling="2026-03-19 09:53:22.710195096 +0000 UTC m=+1720.095609912" lastFinishedPulling="2026-03-19 09:53:25.880081502 +0000 UTC m=+1723.265496318" observedRunningTime="2026-03-19 09:53:27.676797453 +0000 UTC m=+1725.062212269" watchObservedRunningTime="2026-03-19 09:53:27.679006517 +0000 UTC m=+1725.064421333"
Mar 19 09:53:27.840011 master-0 kubenswrapper[15202]: I0319 09:53:27.839888 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.514739139 podStartE2EDuration="7.839865879s" podCreationTimestamp="2026-03-19 09:53:20 +0000 UTC" firstStartedPulling="2026-03-19 09:53:22.555531336 +0000 UTC m=+1719.940946152" lastFinishedPulling="2026-03-19 09:53:25.880658066 +0000 UTC m=+1723.266072892" observedRunningTime="2026-03-19 09:53:27.8326044 +0000 UTC m=+1725.218019216" watchObservedRunningTime="2026-03-19 09:53:27.839865879 +0000 UTC m=+1725.225280705"
Mar 19 09:53:27.874566 master-0 kubenswrapper[15202]: I0319 09:53:27.874449 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.9380523629999997 podStartE2EDuration="7.874428585s" podCreationTimestamp="2026-03-19 09:53:20 +0000 UTC" firstStartedPulling="2026-03-19 09:53:21.9408931 +0000 UTC m=+1719.326307916" lastFinishedPulling="2026-03-19 09:53:25.877269322 +0000 UTC m=+1723.262684138" observedRunningTime="2026-03-19 09:53:27.868087868 +0000 UTC m=+1725.253502694" watchObservedRunningTime="2026-03-19 09:53:27.874428585 +0000 UTC m=+1725.259843421"
Mar 19 09:53:28.562973 master-0 kubenswrapper[15202]: I0319 09:53:28.562183 15202 generic.go:334] "Generic (PLEG): container finished" podID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerID="2e5aa9976888f5980e1ce97a4c5ac9c90d3a439fd23fb350e05f3698b726b587" exitCode=0
Mar 19 09:53:28.562973 master-0 kubenswrapper[15202]: I0319 09:53:28.562223 15202 generic.go:334] "Generic (PLEG): container finished" podID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerID="1e73e9e06ce4dbfdeefe190b09f4137eadfe2f1bd7c7509d84d03fcaa981f732" exitCode=143
Mar 19 09:53:28.564321 master-0 kubenswrapper[15202]: I0319 09:53:28.563781 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e1aa29a-415f-42b9-83c5-271c70978d3a","Type":"ContainerDied","Data":"2e5aa9976888f5980e1ce97a4c5ac9c90d3a439fd23fb350e05f3698b726b587"}
Mar 19 09:53:28.564321 master-0 kubenswrapper[15202]: I0319 09:53:28.563819 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e1aa29a-415f-42b9-83c5-271c70978d3a","Type":"ContainerDied","Data":"1e73e9e06ce4dbfdeefe190b09f4137eadfe2f1bd7c7509d84d03fcaa981f732"}
Mar 19 09:53:28.801347 master-0 kubenswrapper[15202]: I0319 09:53:28.801294 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:53:28.900997 master-0 kubenswrapper[15202]: I0319 09:53:28.900937 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-config-data\") pod \"1e1aa29a-415f-42b9-83c5-271c70978d3a\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") "
Mar 19 09:53:28.901271 master-0 kubenswrapper[15202]: I0319 09:53:28.901079 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cslc\" (UniqueName: \"kubernetes.io/projected/1e1aa29a-415f-42b9-83c5-271c70978d3a-kube-api-access-7cslc\") pod \"1e1aa29a-415f-42b9-83c5-271c70978d3a\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") "
Mar 19 09:53:28.901271 master-0 kubenswrapper[15202]: I0319 09:53:28.901134 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-combined-ca-bundle\") pod \"1e1aa29a-415f-42b9-83c5-271c70978d3a\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") "
Mar 19 09:53:28.901271 master-0 kubenswrapper[15202]: I0319 09:53:28.901210 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e1aa29a-415f-42b9-83c5-271c70978d3a-logs\") pod \"1e1aa29a-415f-42b9-83c5-271c70978d3a\" (UID: \"1e1aa29a-415f-42b9-83c5-271c70978d3a\") "
Mar 19 09:53:28.901683 master-0 kubenswrapper[15202]: I0319 09:53:28.901640 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e1aa29a-415f-42b9-83c5-271c70978d3a-logs" (OuterVolumeSpecName: "logs") pod "1e1aa29a-415f-42b9-83c5-271c70978d3a" (UID: "1e1aa29a-415f-42b9-83c5-271c70978d3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:53:28.902835 master-0 kubenswrapper[15202]: I0319 09:53:28.902806 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e1aa29a-415f-42b9-83c5-271c70978d3a-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:28.909573 master-0 kubenswrapper[15202]: I0319 09:53:28.908725 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1aa29a-415f-42b9-83c5-271c70978d3a-kube-api-access-7cslc" (OuterVolumeSpecName: "kube-api-access-7cslc") pod "1e1aa29a-415f-42b9-83c5-271c70978d3a" (UID: "1e1aa29a-415f-42b9-83c5-271c70978d3a"). InnerVolumeSpecName "kube-api-access-7cslc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:53:28.928767 master-0 kubenswrapper[15202]: I0319 09:53:28.928704 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-config-data" (OuterVolumeSpecName: "config-data") pod "1e1aa29a-415f-42b9-83c5-271c70978d3a" (UID: "1e1aa29a-415f-42b9-83c5-271c70978d3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:28.933391 master-0 kubenswrapper[15202]: I0319 09:53:28.933348 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e1aa29a-415f-42b9-83c5-271c70978d3a" (UID: "1e1aa29a-415f-42b9-83c5-271c70978d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:29.005552 master-0 kubenswrapper[15202]: I0319 09:53:29.005498 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:29.005831 master-0 kubenswrapper[15202]: I0319 09:53:29.005809 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cslc\" (UniqueName: \"kubernetes.io/projected/1e1aa29a-415f-42b9-83c5-271c70978d3a-kube-api-access-7cslc\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:29.005986 master-0 kubenswrapper[15202]: I0319 09:53:29.005969 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1aa29a-415f-42b9-83c5-271c70978d3a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:29.580193 master-0 kubenswrapper[15202]: I0319 09:53:29.580025 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1e1aa29a-415f-42b9-83c5-271c70978d3a","Type":"ContainerDied","Data":"95c3d6cfff16080dfe01ab491118ac6610a912788a06f9e38dfdf0b6e3a38aca"}
Mar 19 09:53:29.580193 master-0 kubenswrapper[15202]: I0319 09:53:29.580184 15202 scope.go:117] "RemoveContainer" containerID="2e5aa9976888f5980e1ce97a4c5ac9c90d3a439fd23fb350e05f3698b726b587"
Mar 19 09:53:29.580809 master-0 kubenswrapper[15202]: I0319 09:53:29.580438 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:53:29.650975 master-0 kubenswrapper[15202]: I0319 09:53:29.635923 15202 scope.go:117] "RemoveContainer" containerID="1e73e9e06ce4dbfdeefe190b09f4137eadfe2f1bd7c7509d84d03fcaa981f732"
Mar 19 09:53:29.663154 master-0 kubenswrapper[15202]: I0319 09:53:29.662873 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:29.676042 master-0 kubenswrapper[15202]: I0319 09:53:29.675009 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:29.707359 master-0 kubenswrapper[15202]: I0319 09:53:29.707279 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:29.707965 master-0 kubenswrapper[15202]: E0319 09:53:29.707935 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-log"
Mar 19 09:53:29.707965 master-0 kubenswrapper[15202]: I0319 09:53:29.707960 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-log"
Mar 19 09:53:29.708088 master-0 kubenswrapper[15202]: E0319 09:53:29.707971 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-metadata"
Mar 19 09:53:29.708088 master-0 kubenswrapper[15202]: I0319 09:53:29.707978 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-metadata"
Mar 19 09:53:29.708257 master-0 kubenswrapper[15202]: I0319 09:53:29.708235 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-metadata"
Mar 19 09:53:29.708299 master-0 kubenswrapper[15202]: I0319 09:53:29.708270 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" containerName="nova-metadata-log"
Mar 19 09:53:29.709573 master-0 kubenswrapper[15202]: I0319 09:53:29.709543 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:53:29.711258 master-0 kubenswrapper[15202]: I0319 09:53:29.711232 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 19 09:53:29.712414 master-0 kubenswrapper[15202]: I0319 09:53:29.712385 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 19 09:53:29.742567 master-0 kubenswrapper[15202]: I0319 09:53:29.742498 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:29.827304 master-0 kubenswrapper[15202]: I0319 09:53:29.827235 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-logs\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.827542 master-0 kubenswrapper[15202]: I0319 09:53:29.827325 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-config-data\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.827542 master-0 kubenswrapper[15202]: I0319 09:53:29.827452 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.827542 master-0 kubenswrapper[15202]: I0319 09:53:29.827535 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmkq\" (UniqueName: \"kubernetes.io/projected/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-kube-api-access-gmmkq\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.827861 master-0 kubenswrapper[15202]: I0319 09:53:29.827794 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.930370 master-0 kubenswrapper[15202]: I0319 09:53:29.930286 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-logs\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.930370 master-0 kubenswrapper[15202]: I0319 09:53:29.930343 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-config-data\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.930902 master-0 kubenswrapper[15202]: I0319 09:53:29.930860 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-logs\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.931208 master-0 kubenswrapper[15202]: I0319 09:53:29.931164 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.932877 master-0 kubenswrapper[15202]: I0319 09:53:29.931301 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmkq\" (UniqueName: \"kubernetes.io/projected/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-kube-api-access-gmmkq\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.932877 master-0 kubenswrapper[15202]: I0319 09:53:29.932336 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.936202 master-0 kubenswrapper[15202]: I0319 09:53:29.935701 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-config-data\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.936926 master-0 kubenswrapper[15202]: I0319 09:53:29.936873 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.938369 master-0 kubenswrapper[15202]: I0319 09:53:29.938294 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:29.948693 master-0 kubenswrapper[15202]: I0319 09:53:29.948640 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmkq\" (UniqueName: \"kubernetes.io/projected/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-kube-api-access-gmmkq\") pod \"nova-metadata-0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " pod="openstack/nova-metadata-0"
Mar 19 09:53:30.029343 master-0 kubenswrapper[15202]: I0319 09:53:30.029292 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:53:30.534663 master-0 kubenswrapper[15202]: W0319 09:53:30.534599 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbc9c3f9_4518_4d1a_b065_3eebf19dc8c0.slice/crio-2308f591ea601dfb0a8a4e156086437d810500ff8b3de67f70f9762d7c5fe7ee WatchSource:0}: Error finding container 2308f591ea601dfb0a8a4e156086437d810500ff8b3de67f70f9762d7c5fe7ee: Status 404 returned error can't find the container with id 2308f591ea601dfb0a8a4e156086437d810500ff8b3de67f70f9762d7c5fe7ee
Mar 19 09:53:30.537326 master-0 kubenswrapper[15202]: I0319 09:53:30.537098 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:53:30.608218 master-0 kubenswrapper[15202]: I0319 09:53:30.608144 15202 generic.go:334] "Generic (PLEG): container finished" podID="a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" containerID="7a3db32261122d75f83eb172b197008388cf30d3d9bff2ad096104ff269a2851" exitCode=0
Mar 19 09:53:30.610810 master-0 kubenswrapper[15202]: I0319 09:53:30.608278 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t8sfd" event={"ID":"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3","Type":"ContainerDied","Data":"7a3db32261122d75f83eb172b197008388cf30d3d9bff2ad096104ff269a2851"}
Mar 19 09:53:30.614928 master-0 kubenswrapper[15202]: I0319 09:53:30.614882 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0","Type":"ContainerStarted","Data":"2308f591ea601dfb0a8a4e156086437d810500ff8b3de67f70f9762d7c5fe7ee"}
Mar 19 09:53:30.835583 master-0 kubenswrapper[15202]: I0319 09:53:30.835505 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1aa29a-415f-42b9-83c5-271c70978d3a" path="/var/lib/kubelet/pods/1e1aa29a-415f-42b9-83c5-271c70978d3a/volumes"
Mar 19 09:53:31.143272 master-0 kubenswrapper[15202]: I0319 09:53:31.143146 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 19 09:53:31.143272 master-0 kubenswrapper[15202]: I0319 09:53:31.143207 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 19 09:53:31.448116 master-0 kubenswrapper[15202]: I0319 09:53:31.448051 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:31.629943 master-0 kubenswrapper[15202]: I0319 09:53:31.629861 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0","Type":"ContainerStarted","Data":"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933"}
Mar 19 09:53:31.629943 master-0 kubenswrapper[15202]: I0319 09:53:31.629930 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0","Type":"ContainerStarted","Data":"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e"}
Mar 19 09:53:31.658717 master-0 kubenswrapper[15202]: I0319 09:53:31.658500 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.658459586 podStartE2EDuration="2.658459586s" podCreationTimestamp="2026-03-19 09:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:31.65740293 +0000 UTC m=+1729.042817746" watchObservedRunningTime="2026-03-19 09:53:31.658459586 +0000 UTC m=+1729.043874402"
Mar 19 09:53:31.665075 master-0 kubenswrapper[15202]: I0319 09:53:31.665014 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 19 09:53:31.665075 master-0 kubenswrapper[15202]: I0319 09:53:31.665068 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 19 09:53:31.705694 master-0 kubenswrapper[15202]: I0319 09:53:31.705562 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 19 09:53:31.778967 master-0 kubenswrapper[15202]: I0319 09:53:31.778909 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk"
Mar 19 09:53:31.896526 master-0 kubenswrapper[15202]: I0319 09:53:31.894644 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6897ccd865-b6qgp"]
Mar 19 09:53:31.896526 master-0 kubenswrapper[15202]: I0319 09:53:31.894970 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerName="dnsmasq-dns" containerID="cri-o://c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd" gracePeriod=10
Mar 19 09:53:32.224494 master-0 kubenswrapper[15202]: I0319 09:53:32.219864 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:32.246574 master-0 kubenswrapper[15202]: I0319 09:53:32.244970 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:53:32.246574 master-0 kubenswrapper[15202]: I0319 09:53:32.245307 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:53:32.308598 master-0 kubenswrapper[15202]: I0319 09:53:32.308532 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-combined-ca-bundle\") pod \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") "
Mar 19 09:53:32.309259 master-0 kubenswrapper[15202]: I0319 09:53:32.309234 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-config-data\") pod \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") "
Mar 19 09:53:32.309584 master-0 kubenswrapper[15202]: I0319 09:53:32.309537 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-scripts\") pod \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") "
Mar 19 09:53:32.309723 master-0 kubenswrapper[15202]: I0319 09:53:32.309635 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvqkn\" (UniqueName: \"kubernetes.io/projected/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-kube-api-access-dvqkn\") pod \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\" (UID: \"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3\") "
Mar 19 09:53:32.313706 master-0 kubenswrapper[15202]: I0319 09:53:32.313614 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-kube-api-access-dvqkn" (OuterVolumeSpecName: "kube-api-access-dvqkn") pod "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" (UID: "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3"). InnerVolumeSpecName "kube-api-access-dvqkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:53:32.319459 master-0 kubenswrapper[15202]: I0319 09:53:32.319415 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-scripts" (OuterVolumeSpecName: "scripts") pod "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" (UID: "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:32.348922 master-0 kubenswrapper[15202]: I0319 09:53:32.348853 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-config-data" (OuterVolumeSpecName: "config-data") pod "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" (UID: "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:32.418931 master-0 kubenswrapper[15202]: I0319 09:53:32.418866 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.418931 master-0 kubenswrapper[15202]: I0319 09:53:32.418920 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.418931 master-0 kubenswrapper[15202]: I0319 09:53:32.418936 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvqkn\" (UniqueName: \"kubernetes.io/projected/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-kube-api-access-dvqkn\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.452215 master-0 kubenswrapper[15202]: I0319 09:53:32.452138 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" (UID: "a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:32.521345 master-0 kubenswrapper[15202]: I0319 09:53:32.521213 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.556173 master-0 kubenswrapper[15202]: I0319 09:53:32.556110 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:53:32.670053 master-0 kubenswrapper[15202]: I0319 09:53:32.662065 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-t8sfd" event={"ID":"a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3","Type":"ContainerDied","Data":"d111f7dec4a17dc66d70dcd7258d61b9556a198dcb83f844fc1b9d9b4fb9bc5c"}
Mar 19 09:53:32.670053 master-0 kubenswrapper[15202]: I0319 09:53:32.662130 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d111f7dec4a17dc66d70dcd7258d61b9556a198dcb83f844fc1b9d9b4fb9bc5c"
Mar 19 09:53:32.670053 master-0 kubenswrapper[15202]: I0319 09:53:32.662233 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-t8sfd"
Mar 19 09:53:32.690005 master-0 kubenswrapper[15202]: I0319 09:53:32.680301 15202 generic.go:334] "Generic (PLEG): container finished" podID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerID="c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd" exitCode=0
Mar 19 09:53:32.690005 master-0 kubenswrapper[15202]: I0319 09:53:32.681565 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" event={"ID":"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8","Type":"ContainerDied","Data":"c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd"}
Mar 19 09:53:32.690005 master-0 kubenswrapper[15202]: I0319 09:53:32.681685 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp" event={"ID":"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8","Type":"ContainerDied","Data":"b6761f91cd49e2b5e0769df26952258e71b0c64735a00f1c17ae75b49f39b2f2"}
Mar 19 09:53:32.690005 master-0 kubenswrapper[15202]: I0319 09:53:32.681717 15202 scope.go:117] "RemoveContainer" containerID="c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd"
Mar 19 09:53:32.690005 master-0 kubenswrapper[15202]: I0319 09:53:32.681959 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6897ccd865-b6qgp"
Mar 19 09:53:32.735090 master-0 kubenswrapper[15202]: I0319 09:53:32.734528 15202 scope.go:117] "RemoveContainer" containerID="46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7"
Mar 19 09:53:32.739404 master-0 kubenswrapper[15202]: I0319 09:53:32.739322 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-swift-storage-0\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.739679 master-0 kubenswrapper[15202]: I0319 09:53:32.739642 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-svc\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.739728 master-0 kubenswrapper[15202]: I0319 09:53:32.739707 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-nb\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.739769 master-0 kubenswrapper[15202]: I0319 09:53:32.739738 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxtmd\" (UniqueName: \"kubernetes.io/projected/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-kube-api-access-kxtmd\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.739810 master-0 kubenswrapper[15202]: I0319 09:53:32.739775 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-b\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.740104 master-0 kubenswrapper[15202]: I0319 09:53:32.740065 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-config\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.740190 master-0 kubenswrapper[15202]: I0319 09:53:32.740118 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-sb\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.740259 master-0 kubenswrapper[15202]: I0319 09:53:32.740235 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-a\") pod \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\" (UID: \"ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8\") "
Mar 19 09:53:32.750135 master-0 kubenswrapper[15202]: I0319 09:53:32.748664 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-kube-api-access-kxtmd" (OuterVolumeSpecName: "kube-api-access-kxtmd") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "kube-api-access-kxtmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:53:32.767988 master-0 kubenswrapper[15202]: I0319 09:53:32.767680 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 19 09:53:32.768345 master-0 kubenswrapper[15202]: I0319 09:53:32.768285 15202 scope.go:117] "RemoveContainer" containerID="c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd"
Mar 19 09:53:32.768747 master-0 kubenswrapper[15202]: E0319 09:53:32.768712 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd\": container with ID starting with c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd not found: ID does not exist" containerID="c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd"
Mar 19 09:53:32.768796 master-0 kubenswrapper[15202]: I0319 09:53:32.768748 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd"} err="failed to get container status \"c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd\": rpc error: code = NotFound desc = could not find container \"c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd\": container with ID starting with c9558fdbca71ca834a333c2a66aa4cc899d135fd43df2dd4ab9208a54eae60dd not found: ID does not exist"
Mar 19 09:53:32.768796 master-0 kubenswrapper[15202]: I0319 09:53:32.768769 15202 scope.go:117] "RemoveContainer" containerID="46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7"
Mar 19 09:53:32.769148 master-0 kubenswrapper[15202]: E0319 09:53:32.768973 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7\": container with ID starting with 46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7 not found: ID does not exist" containerID="46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7"
Mar 19 09:53:32.769148 master-0 kubenswrapper[15202]: I0319 09:53:32.769002 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7"} err="failed to get container status \"46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7\": rpc error: code = NotFound desc = could not find container \"46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7\": container with ID starting with 46a9149c68ad7f57432415b913ec790a7e1ad2d503b728405df39baffd517be7 not found: ID does not exist"
Mar 19 09:53:32.796282 master-0 kubenswrapper[15202]: I0319 09:53:32.796184 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:32.796898 master-0 kubenswrapper[15202]: I0319 09:53:32.796576 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-log" containerID="cri-o://0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87" gracePeriod=30
Mar 19 09:53:32.796898 master-0 kubenswrapper[15202]: I0319 09:53:32.796660 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-api" containerID="cri-o://53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449" gracePeriod=30
Mar 19 09:53:32.823155 master-0 kubenswrapper[15202]: I0319 09:53:32.823084 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-config" (OuterVolumeSpecName: "config") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:53:32.838551 master-0 kubenswrapper[15202]: I0319 09:53:32.838491 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:53:32.845534 master-0 kubenswrapper[15202]: I0319 09:53:32.845048 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.845534 master-0 kubenswrapper[15202]: I0319 09:53:32.845095 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxtmd\" (UniqueName: \"kubernetes.io/projected/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-kube-api-access-kxtmd\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.845534 master-0 kubenswrapper[15202]: I0319 09:53:32.845105 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-config\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:32.847378 master-0 kubenswrapper[15202]: I0319 09:53:32.847320 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:32.865099 master-0 kubenswrapper[15202]: I0319 09:53:32.865028 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:32.865942 master-0 kubenswrapper[15202]: I0319 09:53:32.865885 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:32.875122 master-0 kubenswrapper[15202]: I0319 09:53:32.875051 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:32.875708 master-0 kubenswrapper[15202]: I0319 09:53:32.875636 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" (UID: "ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8"). InnerVolumeSpecName "edpm-b". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:32.954098 master-0 kubenswrapper[15202]: I0319 09:53:32.953813 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:32.954098 master-0 kubenswrapper[15202]: I0319 09:53:32.953976 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:32.954098 master-0 kubenswrapper[15202]: I0319 09:53:32.953990 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:32.954098 master-0 kubenswrapper[15202]: I0319 09:53:32.954000 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:32.954098 master-0 kubenswrapper[15202]: I0319 09:53:32.954010 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:32.967736 master-0 kubenswrapper[15202]: I0319 09:53:32.967665 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:32.997998 master-0 kubenswrapper[15202]: I0319 09:53:32.997918 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:53:33.101177 master-0 kubenswrapper[15202]: I0319 09:53:33.101114 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6897ccd865-b6qgp"] Mar 19 
09:53:33.119097 master-0 kubenswrapper[15202]: I0319 09:53:33.119017 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6897ccd865-b6qgp"] Mar 19 09:53:33.707503 master-0 kubenswrapper[15202]: I0319 09:53:33.706663 15202 generic.go:334] "Generic (PLEG): container finished" podID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerID="0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87" exitCode=143 Mar 19 09:53:33.707503 master-0 kubenswrapper[15202]: I0319 09:53:33.706747 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"82f5e985-3f6a-4c95-a8b1-b107ff60cf25","Type":"ContainerDied","Data":"0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87"} Mar 19 09:53:33.719752 master-0 kubenswrapper[15202]: I0319 09:53:33.719684 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-log" containerID="cri-o://c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e" gracePeriod=30 Mar 19 09:53:33.720224 master-0 kubenswrapper[15202]: I0319 09:53:33.720193 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-metadata" containerID="cri-o://2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933" gracePeriod=30 Mar 19 09:53:34.698330 master-0 kubenswrapper[15202]: I0319 09:53:34.698286 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:53:34.771311 master-0 kubenswrapper[15202]: I0319 09:53:34.771240 15202 generic.go:334] "Generic (PLEG): container finished" podID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerID="2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933" exitCode=0 Mar 19 09:53:34.771311 master-0 kubenswrapper[15202]: I0319 09:53:34.771294 15202 generic.go:334] "Generic (PLEG): container finished" podID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerID="c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e" exitCode=143 Mar 19 09:53:34.771867 master-0 kubenswrapper[15202]: I0319 09:53:34.771431 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:53:34.771867 master-0 kubenswrapper[15202]: I0319 09:53:34.771499 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0","Type":"ContainerDied","Data":"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933"} Mar 19 09:53:34.771867 master-0 kubenswrapper[15202]: I0319 09:53:34.771531 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0","Type":"ContainerDied","Data":"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e"} Mar 19 09:53:34.771867 master-0 kubenswrapper[15202]: I0319 09:53:34.771547 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0","Type":"ContainerDied","Data":"2308f591ea601dfb0a8a4e156086437d810500ff8b3de67f70f9762d7c5fe7ee"} Mar 19 09:53:34.771867 master-0 kubenswrapper[15202]: I0319 09:53:34.771571 15202 scope.go:117] "RemoveContainer" containerID="2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933" Mar 19 09:53:34.772493 master-0 kubenswrapper[15202]: I0319 09:53:34.772110 
15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d493f151-b19b-4399-a5cc-cf611fc5e727" containerName="nova-scheduler-scheduler" containerID="cri-o://58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" gracePeriod=30 Mar 19 09:53:34.800370 master-0 kubenswrapper[15202]: I0319 09:53:34.800320 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-nova-metadata-tls-certs\") pod \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " Mar 19 09:53:34.800746 master-0 kubenswrapper[15202]: I0319 09:53:34.800728 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-config-data\") pod \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " Mar 19 09:53:34.800997 master-0 kubenswrapper[15202]: I0319 09:53:34.800981 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-logs\") pod \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " Mar 19 09:53:34.801107 master-0 kubenswrapper[15202]: I0319 09:53:34.801093 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmkq\" (UniqueName: \"kubernetes.io/projected/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-kube-api-access-gmmkq\") pod \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " Mar 19 09:53:34.801231 master-0 kubenswrapper[15202]: I0319 09:53:34.801218 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-combined-ca-bundle\") pod \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\" (UID: \"dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0\") " Mar 19 09:53:34.801511 master-0 kubenswrapper[15202]: I0319 09:53:34.801449 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-logs" (OuterVolumeSpecName: "logs") pod "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" (UID: "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:53:34.801914 master-0 kubenswrapper[15202]: I0319 09:53:34.801898 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:34.810357 master-0 kubenswrapper[15202]: I0319 09:53:34.810286 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-kube-api-access-gmmkq" (OuterVolumeSpecName: "kube-api-access-gmmkq") pod "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" (UID: "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0"). InnerVolumeSpecName "kube-api-access-gmmkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:34.812057 master-0 kubenswrapper[15202]: I0319 09:53:34.811992 15202 scope.go:117] "RemoveContainer" containerID="c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e" Mar 19 09:53:34.835649 master-0 kubenswrapper[15202]: I0319 09:53:34.835506 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" path="/var/lib/kubelet/pods/ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8/volumes" Mar 19 09:53:34.843064 master-0 kubenswrapper[15202]: I0319 09:53:34.842992 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" (UID: "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:34.843542 master-0 kubenswrapper[15202]: I0319 09:53:34.843452 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-config-data" (OuterVolumeSpecName: "config-data") pod "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" (UID: "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:34.864388 master-0 kubenswrapper[15202]: I0319 09:53:34.864331 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" (UID: "dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:34.904136 master-0 kubenswrapper[15202]: I0319 09:53:34.903800 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:34.904136 master-0 kubenswrapper[15202]: I0319 09:53:34.903852 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmkq\" (UniqueName: \"kubernetes.io/projected/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-kube-api-access-gmmkq\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:34.904136 master-0 kubenswrapper[15202]: I0319 09:53:34.903870 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:34.904136 master-0 kubenswrapper[15202]: I0319 09:53:34.903884 15202 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:34.928308 master-0 kubenswrapper[15202]: I0319 09:53:34.928254 15202 scope.go:117] "RemoveContainer" containerID="2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933" Mar 19 09:53:34.928924 master-0 kubenswrapper[15202]: E0319 09:53:34.928884 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933\": container with ID starting with 2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933 not found: ID does not exist" containerID="2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933" Mar 19 09:53:34.928991 master-0 kubenswrapper[15202]: I0319 09:53:34.928935 15202 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933"} err="failed to get container status \"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933\": rpc error: code = NotFound desc = could not find container \"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933\": container with ID starting with 2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933 not found: ID does not exist" Mar 19 09:53:34.928991 master-0 kubenswrapper[15202]: I0319 09:53:34.928973 15202 scope.go:117] "RemoveContainer" containerID="c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e" Mar 19 09:53:34.929921 master-0 kubenswrapper[15202]: E0319 09:53:34.929619 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e\": container with ID starting with c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e not found: ID does not exist" containerID="c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e" Mar 19 09:53:34.929921 master-0 kubenswrapper[15202]: I0319 09:53:34.929702 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e"} err="failed to get container status \"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e\": rpc error: code = NotFound desc = could not find container \"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e\": container with ID starting with c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e not found: ID does not exist" Mar 19 09:53:34.929921 master-0 kubenswrapper[15202]: I0319 09:53:34.929737 15202 scope.go:117] "RemoveContainer" containerID="2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933" 
Mar 19 09:53:34.930946 master-0 kubenswrapper[15202]: I0319 09:53:34.930908 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933"} err="failed to get container status \"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933\": rpc error: code = NotFound desc = could not find container \"2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933\": container with ID starting with 2e6ff7b6b3172a4b4ffac96c2240f19e700495a4bd1395efad16c3e6a75d9933 not found: ID does not exist" Mar 19 09:53:34.930946 master-0 kubenswrapper[15202]: I0319 09:53:34.930939 15202 scope.go:117] "RemoveContainer" containerID="c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e" Mar 19 09:53:34.931230 master-0 kubenswrapper[15202]: I0319 09:53:34.931201 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e"} err="failed to get container status \"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e\": rpc error: code = NotFound desc = could not find container \"c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e\": container with ID starting with c468898600ac6d2229360be524782b503e3401fdacd2026d8cb38e4e8e71619e not found: ID does not exist" Mar 19 09:53:35.133160 master-0 kubenswrapper[15202]: I0319 09:53:35.133109 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:35.151546 master-0 kubenswrapper[15202]: I0319 09:53:35.151436 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:35.170679 master-0 kubenswrapper[15202]: I0319 09:53:35.170593 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:35.171294 master-0 kubenswrapper[15202]: E0319 09:53:35.171258 15202 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-metadata" Mar 19 09:53:35.171294 master-0 kubenswrapper[15202]: I0319 09:53:35.171283 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-metadata" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: E0319 09:53:35.171302 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-log" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: I0319 09:53:35.171310 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-log" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: E0319 09:53:35.171337 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" containerName="nova-manage" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: I0319 09:53:35.171344 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" containerName="nova-manage" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: E0319 09:53:35.171364 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerName="init" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: I0319 09:53:35.171371 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerName="init" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: E0319 09:53:35.171386 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerName="dnsmasq-dns" Mar 19 09:53:35.171426 master-0 kubenswrapper[15202]: I0319 09:53:35.171392 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" 
containerName="dnsmasq-dns" Mar 19 09:53:35.171819 master-0 kubenswrapper[15202]: I0319 09:53:35.171666 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-log" Mar 19 09:53:35.171819 master-0 kubenswrapper[15202]: I0319 09:53:35.171683 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4bbb6c-4df8-45cf-b21d-75dab94bfaa8" containerName="dnsmasq-dns" Mar 19 09:53:35.171819 master-0 kubenswrapper[15202]: I0319 09:53:35.171702 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" containerName="nova-manage" Mar 19 09:53:35.171819 master-0 kubenswrapper[15202]: I0319 09:53:35.171715 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" containerName="nova-metadata-metadata" Mar 19 09:53:35.173164 master-0 kubenswrapper[15202]: I0319 09:53:35.173128 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:53:35.177070 master-0 kubenswrapper[15202]: I0319 09:53:35.177022 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 09:53:35.177356 master-0 kubenswrapper[15202]: I0319 09:53:35.177321 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 09:53:35.186640 master-0 kubenswrapper[15202]: I0319 09:53:35.185233 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:35.312328 master-0 kubenswrapper[15202]: I0319 09:53:35.312251 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2g7\" (UniqueName: \"kubernetes.io/projected/d0bb69fd-8511-4feb-949e-3ca2388274dc-kube-api-access-tf2g7\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.312539 master-0 kubenswrapper[15202]: I0319 09:53:35.312409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-config-data\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.312539 master-0 kubenswrapper[15202]: I0319 09:53:35.312488 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.312682 master-0 kubenswrapper[15202]: I0319 09:53:35.312643 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.312812 master-0 kubenswrapper[15202]: I0319 09:53:35.312784 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0bb69fd-8511-4feb-949e-3ca2388274dc-logs\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.414889 master-0 kubenswrapper[15202]: I0319 09:53:35.414743 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0bb69fd-8511-4feb-949e-3ca2388274dc-logs\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.415137 master-0 kubenswrapper[15202]: I0319 09:53:35.414905 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2g7\" (UniqueName: \"kubernetes.io/projected/d0bb69fd-8511-4feb-949e-3ca2388274dc-kube-api-access-tf2g7\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.415137 master-0 kubenswrapper[15202]: I0319 09:53:35.414961 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-config-data\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.415248 master-0 kubenswrapper[15202]: I0319 09:53:35.415193 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.415372 master-0 kubenswrapper[15202]: I0319 09:53:35.415334 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0bb69fd-8511-4feb-949e-3ca2388274dc-logs\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.415534 master-0 kubenswrapper[15202]: I0319 09:53:35.415489 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.419539 master-0 kubenswrapper[15202]: I0319 09:53:35.418825 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-config-data\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.425376 master-0 kubenswrapper[15202]: I0319 09:53:35.425332 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.425461 master-0 kubenswrapper[15202]: I0319 09:53:35.425341 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.431326 master-0 kubenswrapper[15202]: 
I0319 09:53:35.431273 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2g7\" (UniqueName: \"kubernetes.io/projected/d0bb69fd-8511-4feb-949e-3ca2388274dc-kube-api-access-tf2g7\") pod \"nova-metadata-0\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") " pod="openstack/nova-metadata-0" Mar 19 09:53:35.504287 master-0 kubenswrapper[15202]: I0319 09:53:35.504218 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:53:35.964244 master-0 kubenswrapper[15202]: W0319 09:53:35.964176 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0bb69fd_8511_4feb_949e_3ca2388274dc.slice/crio-b14327301e72161e1755bb74569f26906a5a4ddddd14697258bdb26b5025baad WatchSource:0}: Error finding container b14327301e72161e1755bb74569f26906a5a4ddddd14697258bdb26b5025baad: Status 404 returned error can't find the container with id b14327301e72161e1755bb74569f26906a5a4ddddd14697258bdb26b5025baad Mar 19 09:53:35.966084 master-0 kubenswrapper[15202]: I0319 09:53:35.966054 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:53:36.667272 master-0 kubenswrapper[15202]: E0319 09:53:36.667046 15202 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:53:36.668721 master-0 kubenswrapper[15202]: E0319 09:53:36.668676 15202 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:53:36.670545 master-0 kubenswrapper[15202]: E0319 09:53:36.670502 15202 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:53:36.670545 master-0 kubenswrapper[15202]: E0319 09:53:36.670536 15202 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d493f151-b19b-4399-a5cc-cf611fc5e727" containerName="nova-scheduler-scheduler" Mar 19 09:53:36.818996 master-0 kubenswrapper[15202]: I0319 09:53:36.818922 15202 generic.go:334] "Generic (PLEG): container finished" podID="499db210-7cab-4a33-99b1-3be10260b2c2" containerID="e6bc5b6305aa59c429b80b85e16d99b583ea566e1a6003529d789d35901aa80e" exitCode=0 Mar 19 09:53:36.825640 master-0 kubenswrapper[15202]: I0319 09:53:36.825569 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0" path="/var/lib/kubelet/pods/dbc9c3f9-4518-4d1a-b065-3eebf19dc8c0/volumes" Mar 19 09:53:36.826375 master-0 kubenswrapper[15202]: I0319 09:53:36.826335 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" event={"ID":"499db210-7cab-4a33-99b1-3be10260b2c2","Type":"ContainerDied","Data":"e6bc5b6305aa59c429b80b85e16d99b583ea566e1a6003529d789d35901aa80e"} Mar 19 09:53:36.826440 master-0 kubenswrapper[15202]: I0319 09:53:36.826382 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d0bb69fd-8511-4feb-949e-3ca2388274dc","Type":"ContainerStarted","Data":"3ffe27364d3e76eed5acd2b2cde951376df510d6b107f537aeb7ed98e95e32a7"} Mar 19 09:53:36.826440 master-0 kubenswrapper[15202]: I0319 09:53:36.826397 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0bb69fd-8511-4feb-949e-3ca2388274dc","Type":"ContainerStarted","Data":"295e7e8891c3942e3f85dc66aae2ae55fdb84078ee0f214962c30c10d15dc2b9"} Mar 19 09:53:36.826440 master-0 kubenswrapper[15202]: I0319 09:53:36.826406 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0bb69fd-8511-4feb-949e-3ca2388274dc","Type":"ContainerStarted","Data":"b14327301e72161e1755bb74569f26906a5a4ddddd14697258bdb26b5025baad"} Mar 19 09:53:36.862226 master-0 kubenswrapper[15202]: I0319 09:53:36.862105 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.86208108 podStartE2EDuration="1.86208108s" podCreationTimestamp="2026-03-19 09:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:36.84995635 +0000 UTC m=+1734.235371166" watchObservedRunningTime="2026-03-19 09:53:36.86208108 +0000 UTC m=+1734.247495906" Mar 19 09:53:38.436774 master-0 kubenswrapper[15202]: I0319 09:53:38.436734 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:38.613606 master-0 kubenswrapper[15202]: I0319 09:53:38.613528 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-config-data\") pod \"499db210-7cab-4a33-99b1-3be10260b2c2\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " Mar 19 09:53:38.613868 master-0 kubenswrapper[15202]: I0319 09:53:38.613773 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-scripts\") pod \"499db210-7cab-4a33-99b1-3be10260b2c2\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " Mar 19 09:53:38.613868 master-0 kubenswrapper[15202]: I0319 09:53:38.613847 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-combined-ca-bundle\") pod \"499db210-7cab-4a33-99b1-3be10260b2c2\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " Mar 19 09:53:38.613976 master-0 kubenswrapper[15202]: I0319 09:53:38.613958 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fzzt\" (UniqueName: \"kubernetes.io/projected/499db210-7cab-4a33-99b1-3be10260b2c2-kube-api-access-2fzzt\") pod \"499db210-7cab-4a33-99b1-3be10260b2c2\" (UID: \"499db210-7cab-4a33-99b1-3be10260b2c2\") " Mar 19 09:53:38.622342 master-0 kubenswrapper[15202]: I0319 09:53:38.619600 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499db210-7cab-4a33-99b1-3be10260b2c2-kube-api-access-2fzzt" (OuterVolumeSpecName: "kube-api-access-2fzzt") pod "499db210-7cab-4a33-99b1-3be10260b2c2" (UID: "499db210-7cab-4a33-99b1-3be10260b2c2"). InnerVolumeSpecName "kube-api-access-2fzzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:38.622342 master-0 kubenswrapper[15202]: I0319 09:53:38.620055 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-scripts" (OuterVolumeSpecName: "scripts") pod "499db210-7cab-4a33-99b1-3be10260b2c2" (UID: "499db210-7cab-4a33-99b1-3be10260b2c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.700192 master-0 kubenswrapper[15202]: I0319 09:53:38.700064 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "499db210-7cab-4a33-99b1-3be10260b2c2" (UID: "499db210-7cab-4a33-99b1-3be10260b2c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.703754 master-0 kubenswrapper[15202]: I0319 09:53:38.703703 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-config-data" (OuterVolumeSpecName: "config-data") pod "499db210-7cab-4a33-99b1-3be10260b2c2" (UID: "499db210-7cab-4a33-99b1-3be10260b2c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.705997 master-0 kubenswrapper[15202]: I0319 09:53:38.705969 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:53:38.717182 master-0 kubenswrapper[15202]: I0319 09:53:38.716991 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.717182 master-0 kubenswrapper[15202]: I0319 09:53:38.717047 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fzzt\" (UniqueName: \"kubernetes.io/projected/499db210-7cab-4a33-99b1-3be10260b2c2-kube-api-access-2fzzt\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.717182 master-0 kubenswrapper[15202]: I0319 09:53:38.717062 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.717182 master-0 kubenswrapper[15202]: I0319 09:53:38.717072 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499db210-7cab-4a33-99b1-3be10260b2c2-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.812119 master-0 kubenswrapper[15202]: I0319 09:53:38.812070 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:53:38.818800 master-0 kubenswrapper[15202]: I0319 09:53:38.818725 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nttmh\" (UniqueName: \"kubernetes.io/projected/d493f151-b19b-4399-a5cc-cf611fc5e727-kube-api-access-nttmh\") pod \"d493f151-b19b-4399-a5cc-cf611fc5e727\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " Mar 19 09:53:38.818924 master-0 kubenswrapper[15202]: I0319 09:53:38.818872 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-combined-ca-bundle\") pod \"d493f151-b19b-4399-a5cc-cf611fc5e727\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " Mar 19 09:53:38.819865 master-0 kubenswrapper[15202]: I0319 09:53:38.818996 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-config-data\") pod \"d493f151-b19b-4399-a5cc-cf611fc5e727\" (UID: \"d493f151-b19b-4399-a5cc-cf611fc5e727\") " Mar 19 09:53:38.829976 master-0 kubenswrapper[15202]: I0319 09:53:38.829673 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d493f151-b19b-4399-a5cc-cf611fc5e727-kube-api-access-nttmh" (OuterVolumeSpecName: "kube-api-access-nttmh") pod "d493f151-b19b-4399-a5cc-cf611fc5e727" (UID: "d493f151-b19b-4399-a5cc-cf611fc5e727"). InnerVolumeSpecName "kube-api-access-nttmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:38.879510 master-0 kubenswrapper[15202]: I0319 09:53:38.878708 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-config-data" (OuterVolumeSpecName: "config-data") pod "d493f151-b19b-4399-a5cc-cf611fc5e727" (UID: "d493f151-b19b-4399-a5cc-cf611fc5e727"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.887245 master-0 kubenswrapper[15202]: I0319 09:53:38.887101 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d493f151-b19b-4399-a5cc-cf611fc5e727" (UID: "d493f151-b19b-4399-a5cc-cf611fc5e727"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.892168 master-0 kubenswrapper[15202]: I0319 09:53:38.892130 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" event={"ID":"499db210-7cab-4a33-99b1-3be10260b2c2","Type":"ContainerDied","Data":"ba412efe7e3ca4d9971f84136722c6affc1bff2c8e4b512852b7658cf73ec6d3"} Mar 19 09:53:38.892168 master-0 kubenswrapper[15202]: I0319 09:53:38.892176 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba412efe7e3ca4d9971f84136722c6affc1bff2c8e4b512852b7658cf73ec6d3" Mar 19 09:53:38.892341 master-0 kubenswrapper[15202]: I0319 09:53:38.892244 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s6rxr" Mar 19 09:53:38.916107 master-0 kubenswrapper[15202]: I0319 09:53:38.915742 15202 generic.go:334] "Generic (PLEG): container finished" podID="d493f151-b19b-4399-a5cc-cf611fc5e727" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" exitCode=0 Mar 19 09:53:38.916107 master-0 kubenswrapper[15202]: I0319 09:53:38.915827 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d493f151-b19b-4399-a5cc-cf611fc5e727","Type":"ContainerDied","Data":"58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de"} Mar 19 09:53:38.916107 master-0 kubenswrapper[15202]: I0319 09:53:38.915856 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d493f151-b19b-4399-a5cc-cf611fc5e727","Type":"ContainerDied","Data":"6006a377cd6492eeb20723837f3374bc15d1af5b9cf959d152eecc4f3493b8ca"} Mar 19 09:53:38.916107 master-0 kubenswrapper[15202]: I0319 09:53:38.915875 15202 scope.go:117] "RemoveContainer" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" Mar 19 09:53:38.916107 master-0 kubenswrapper[15202]: I0319 09:53:38.916008 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.922684 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-config-data\") pod \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.922840 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-combined-ca-bundle\") pod \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.923068 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-logs\") pod \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.923156 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skkxw\" (UniqueName: \"kubernetes.io/projected/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-kube-api-access-skkxw\") pod \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\" (UID: \"82f5e985-3f6a-4c95-a8b1-b107ff60cf25\") " Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.923636 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-logs" (OuterVolumeSpecName: "logs") pod "82f5e985-3f6a-4c95-a8b1-b107ff60cf25" (UID: "82f5e985-3f6a-4c95-a8b1-b107ff60cf25"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.923729 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.923745 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d493f151-b19b-4399-a5cc-cf611fc5e727-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.924599 master-0 kubenswrapper[15202]: I0319 09:53:38.923755 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nttmh\" (UniqueName: \"kubernetes.io/projected/d493f151-b19b-4399-a5cc-cf611fc5e727-kube-api-access-nttmh\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:38.930929 master-0 kubenswrapper[15202]: I0319 09:53:38.928700 15202 generic.go:334] "Generic (PLEG): container finished" podID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerID="53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449" exitCode=0 Mar 19 09:53:38.930929 master-0 kubenswrapper[15202]: I0319 09:53:38.928770 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"82f5e985-3f6a-4c95-a8b1-b107ff60cf25","Type":"ContainerDied","Data":"53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449"} Mar 19 09:53:38.930929 master-0 kubenswrapper[15202]: I0319 09:53:38.928780 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:53:38.930929 master-0 kubenswrapper[15202]: I0319 09:53:38.928801 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"82f5e985-3f6a-4c95-a8b1-b107ff60cf25","Type":"ContainerDied","Data":"b0a0b3a4a99cb97944a7aba2e47d4614990e52ae8399aeda105ddf3f668e38e9"} Mar 19 09:53:38.933417 master-0 kubenswrapper[15202]: I0319 09:53:38.933363 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-kube-api-access-skkxw" (OuterVolumeSpecName: "kube-api-access-skkxw") pod "82f5e985-3f6a-4c95-a8b1-b107ff60cf25" (UID: "82f5e985-3f6a-4c95-a8b1-b107ff60cf25"). InnerVolumeSpecName "kube-api-access-skkxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:38.962316 master-0 kubenswrapper[15202]: I0319 09:53:38.962257 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-config-data" (OuterVolumeSpecName: "config-data") pod "82f5e985-3f6a-4c95-a8b1-b107ff60cf25" (UID: "82f5e985-3f6a-4c95-a8b1-b107ff60cf25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.966343 master-0 kubenswrapper[15202]: I0319 09:53:38.966261 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82f5e985-3f6a-4c95-a8b1-b107ff60cf25" (UID: "82f5e985-3f6a-4c95-a8b1-b107ff60cf25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:38.999792 master-0 kubenswrapper[15202]: I0319 09:53:38.999711 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 09:53:39.000448 master-0 kubenswrapper[15202]: E0319 09:53:39.000381 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-api" Mar 19 09:53:39.000448 master-0 kubenswrapper[15202]: I0319 09:53:39.000410 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-api" Mar 19 09:53:39.000448 master-0 kubenswrapper[15202]: E0319 09:53:39.000430 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d493f151-b19b-4399-a5cc-cf611fc5e727" containerName="nova-scheduler-scheduler" Mar 19 09:53:39.000448 master-0 kubenswrapper[15202]: I0319 09:53:39.000439 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d493f151-b19b-4399-a5cc-cf611fc5e727" containerName="nova-scheduler-scheduler" Mar 19 09:53:39.000448 master-0 kubenswrapper[15202]: E0319 09:53:39.000451 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499db210-7cab-4a33-99b1-3be10260b2c2" containerName="nova-cell1-conductor-db-sync" Mar 19 09:53:39.000732 master-0 kubenswrapper[15202]: I0319 09:53:39.000462 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="499db210-7cab-4a33-99b1-3be10260b2c2" containerName="nova-cell1-conductor-db-sync" Mar 19 09:53:39.000732 master-0 kubenswrapper[15202]: E0319 09:53:39.000498 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-log" Mar 19 09:53:39.000732 master-0 kubenswrapper[15202]: I0319 09:53:39.000507 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-log" Mar 19 09:53:39.000933 master-0 
kubenswrapper[15202]: I0319 09:53:39.000902 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="499db210-7cab-4a33-99b1-3be10260b2c2" containerName="nova-cell1-conductor-db-sync" Mar 19 09:53:39.000979 master-0 kubenswrapper[15202]: I0319 09:53:39.000932 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-api" Mar 19 09:53:39.000979 master-0 kubenswrapper[15202]: I0319 09:53:39.000948 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="d493f151-b19b-4399-a5cc-cf611fc5e727" containerName="nova-scheduler-scheduler" Mar 19 09:53:39.000979 master-0 kubenswrapper[15202]: I0319 09:53:39.000957 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" containerName="nova-api-log" Mar 19 09:53:39.003661 master-0 kubenswrapper[15202]: I0319 09:53:39.001832 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 09:53:39.009533 master-0 kubenswrapper[15202]: I0319 09:53:39.009459 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 09:53:39.033742 master-0 kubenswrapper[15202]: I0319 09:53:39.026444 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:39.033742 master-0 kubenswrapper[15202]: I0319 09:53:39.026519 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skkxw\" (UniqueName: \"kubernetes.io/projected/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-kube-api-access-skkxw\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:39.033742 master-0 kubenswrapper[15202]: I0319 09:53:39.026532 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:39.033742 master-0 kubenswrapper[15202]: I0319 09:53:39.026543 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f5e985-3f6a-4c95-a8b1-b107ff60cf25-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:53:39.038888 master-0 kubenswrapper[15202]: I0319 09:53:39.038593 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 09:53:39.063732 master-0 kubenswrapper[15202]: I0319 09:53:39.063691 15202 scope.go:117] "RemoveContainer" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" Mar 19 09:53:39.064254 master-0 kubenswrapper[15202]: E0319 09:53:39.064221 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de\": container with ID starting with 58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de not found: ID does not exist" containerID="58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de" Mar 19 09:53:39.064318 master-0 kubenswrapper[15202]: I0319 09:53:39.064256 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de"} err="failed to get container status \"58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de\": rpc error: code = NotFound desc = could not find container \"58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de\": container with ID starting with 58d865af8a189a95f955a7ba435703b3e7f24266b3757b042527560d0a5c92de not found: ID does not exist" Mar 19 09:53:39.064318 master-0 kubenswrapper[15202]: I0319 09:53:39.064278 15202 scope.go:117] "RemoveContainer" 
containerID="53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449" Mar 19 09:53:39.088034 master-0 kubenswrapper[15202]: I0319 09:53:39.087274 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:53:39.123496 master-0 kubenswrapper[15202]: I0319 09:53:39.107340 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:53:39.123496 master-0 kubenswrapper[15202]: I0319 09:53:39.110939 15202 scope.go:117] "RemoveContainer" containerID="0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87" Mar 19 09:53:39.126659 master-0 kubenswrapper[15202]: I0319 09:53:39.126609 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:53:39.128573 master-0 kubenswrapper[15202]: I0319 09:53:39.128543 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:53:39.129730 master-0 kubenswrapper[15202]: I0319 09:53:39.129638 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:53:39.129898 master-0 kubenswrapper[15202]: I0319 09:53:39.129863 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:53:39.129996 master-0 kubenswrapper[15202]: I0319 09:53:39.129964 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xv68\" (UniqueName: 
\"kubernetes.io/projected/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-kube-api-access-6xv68\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:53:39.130667 master-0 kubenswrapper[15202]: I0319 09:53:39.130631 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 09:53:39.135789 master-0 kubenswrapper[15202]: I0319 09:53:39.135741 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:53:39.154668 master-0 kubenswrapper[15202]: I0319 09:53:39.154617 15202 scope.go:117] "RemoveContainer" containerID="53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449" Mar 19 09:53:39.155383 master-0 kubenswrapper[15202]: E0319 09:53:39.155340 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449\": container with ID starting with 53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449 not found: ID does not exist" containerID="53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449" Mar 19 09:53:39.155434 master-0 kubenswrapper[15202]: I0319 09:53:39.155387 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449"} err="failed to get container status \"53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449\": rpc error: code = NotFound desc = could not find container \"53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449\": container with ID starting with 53380592a5261d8c9431e344ef63865cf4427f21622ace2ff1860ff5e8cc2449 not found: ID does not exist" Mar 19 09:53:39.155434 master-0 kubenswrapper[15202]: I0319 09:53:39.155412 15202 scope.go:117] "RemoveContainer" 
containerID="0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87" Mar 19 09:53:39.155979 master-0 kubenswrapper[15202]: E0319 09:53:39.155950 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87\": container with ID starting with 0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87 not found: ID does not exist" containerID="0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87" Mar 19 09:53:39.156086 master-0 kubenswrapper[15202]: I0319 09:53:39.155992 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87"} err="failed to get container status \"0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87\": rpc error: code = NotFound desc = could not find container \"0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87\": container with ID starting with 0ffa9c7c9ed10600960fca8e1e5867d188b22a677b616a1543a508a59dc07b87 not found: ID does not exist" Mar 19 09:53:39.232029 master-0 kubenswrapper[15202]: I0319 09:53:39.231902 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:53:39.232029 master-0 kubenswrapper[15202]: I0319 09:53:39.231989 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0" Mar 19 09:53:39.232279 master-0 
kubenswrapper[15202]: I0319 09:53:39.232047 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rstf\" (UniqueName: \"kubernetes.io/projected/657520fd-77a9-49cd-a2c0-5b3f9da06c59-kube-api-access-9rstf\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.232279 master-0 kubenswrapper[15202]: I0319 09:53:39.232209 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-config-data\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.232373 master-0 kubenswrapper[15202]: I0319 09:53:39.232291 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:39.232501 master-0 kubenswrapper[15202]: I0319 09:53:39.232447 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv68\" (UniqueName: \"kubernetes.io/projected/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-kube-api-access-6xv68\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:39.238145 master-0 kubenswrapper[15202]: I0319 09:53:39.237115 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:39.238449 master-0 kubenswrapper[15202]: I0319 09:53:39.238398 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:39.256280 master-0 kubenswrapper[15202]: I0319 09:53:39.256193 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv68\" (UniqueName: \"kubernetes.io/projected/8816b584-8f92-4ac4-93f3-fcd86f5e64a2-kube-api-access-6xv68\") pod \"nova-cell1-conductor-0\" (UID: \"8816b584-8f92-4ac4-93f3-fcd86f5e64a2\") " pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:39.335831 master-0 kubenswrapper[15202]: I0319 09:53:39.335779 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.336002 master-0 kubenswrapper[15202]: I0319 09:53:39.335974 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rstf\" (UniqueName: \"kubernetes.io/projected/657520fd-77a9-49cd-a2c0-5b3f9da06c59-kube-api-access-9rstf\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.336073 master-0 kubenswrapper[15202]: I0319 09:53:39.336050 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-config-data\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.339840 master-0 kubenswrapper[15202]: I0319 09:53:39.339783 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.340712 master-0 kubenswrapper[15202]: I0319 09:53:39.340677 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-config-data\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.357328 master-0 kubenswrapper[15202]: I0319 09:53:39.357261 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:39.364315 master-0 kubenswrapper[15202]: I0319 09:53:39.363281 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rstf\" (UniqueName: \"kubernetes.io/projected/657520fd-77a9-49cd-a2c0-5b3f9da06c59-kube-api-access-9rstf\") pod \"nova-scheduler-0\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") " pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.376231 master-0 kubenswrapper[15202]: I0319 09:53:39.376171 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:39.398359 master-0 kubenswrapper[15202]: I0319 09:53:39.398304 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:39.410897 master-0 kubenswrapper[15202]: I0319 09:53:39.410839 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:39.413817 master-0 kubenswrapper[15202]: I0319 09:53:39.413782 15202 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0"
Mar 19 09:53:39.419232 master-0 kubenswrapper[15202]: I0319 09:53:39.419184 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 19 09:53:39.453203 master-0 kubenswrapper[15202]: I0319 09:53:39.453117 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 19 09:53:39.457188 master-0 kubenswrapper[15202]: I0319 09:53:39.457121 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:39.567444 master-0 kubenswrapper[15202]: I0319 09:53:39.567377 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9062c234-88ef-4c57-8370-5183a2c5b88e-logs\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.567444 master-0 kubenswrapper[15202]: I0319 09:53:39.567438 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.567721 master-0 kubenswrapper[15202]: I0319 09:53:39.567568 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-config-data\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.570899 master-0 kubenswrapper[15202]: I0319 09:53:39.570836 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6hf\" (UniqueName: \"kubernetes.io/projected/9062c234-88ef-4c57-8370-5183a2c5b88e-kube-api-access-dj6hf\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.674593 master-0 kubenswrapper[15202]: I0319 09:53:39.673547 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9062c234-88ef-4c57-8370-5183a2c5b88e-logs\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.674593 master-0 kubenswrapper[15202]: I0319 09:53:39.673966 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.674593 master-0 kubenswrapper[15202]: I0319 09:53:39.673926 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9062c234-88ef-4c57-8370-5183a2c5b88e-logs\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.674593 master-0 kubenswrapper[15202]: I0319 09:53:39.674529 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-config-data\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.674593 master-0 kubenswrapper[15202]: I0319 09:53:39.674604 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj6hf\" (UniqueName: \"kubernetes.io/projected/9062c234-88ef-4c57-8370-5183a2c5b88e-kube-api-access-dj6hf\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.678287 master-0 kubenswrapper[15202]: I0319 09:53:39.678217 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-config-data\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.684271 master-0 kubenswrapper[15202]: I0319 09:53:39.684222 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.690191 master-0 kubenswrapper[15202]: I0319 09:53:39.690155 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj6hf\" (UniqueName: \"kubernetes.io/projected/9062c234-88ef-4c57-8370-5183a2c5b88e-kube-api-access-dj6hf\") pod \"nova-api-0\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " pod="openstack/nova-api-0"
Mar 19 09:53:39.770985 master-0 kubenswrapper[15202]: I0319 09:53:39.770934 15202 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0"
Mar 19 09:53:39.984572 master-0 kubenswrapper[15202]: I0319 09:53:39.984503 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 19 09:53:40.092121 master-0 kubenswrapper[15202]: W0319 09:53:40.091785 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod657520fd_77a9_49cd_a2c0_5b3f9da06c59.slice/crio-7591e59f39d89f026746cc3ed63c9b9d05d8e8f98f7ca44c6eb3d8233aec84c5 WatchSource:0}: Error finding container 7591e59f39d89f026746cc3ed63c9b9d05d8e8f98f7ca44c6eb3d8233aec84c5: Status 404 returned error can't find the container with id 7591e59f39d89f026746cc3ed63c9b9d05d8e8f98f7ca44c6eb3d8233aec84c5
Mar 19 09:53:40.117094 master-0 kubenswrapper[15202]: I0319 09:53:40.117030 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 09:53:40.362825 master-0 kubenswrapper[15202]: I0319 09:53:40.362156 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:53:40.840500 master-0 kubenswrapper[15202]: I0319 09:53:40.840370 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f5e985-3f6a-4c95-a8b1-b107ff60cf25" path="/var/lib/kubelet/pods/82f5e985-3f6a-4c95-a8b1-b107ff60cf25/volumes"
Mar 19 09:53:40.841636 master-0 kubenswrapper[15202]: I0319 09:53:40.841618 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d493f151-b19b-4399-a5cc-cf611fc5e727" path="/var/lib/kubelet/pods/d493f151-b19b-4399-a5cc-cf611fc5e727/volumes"
Mar 19 09:53:40.996308 master-0 kubenswrapper[15202]: I0319 09:53:40.996246 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9062c234-88ef-4c57-8370-5183a2c5b88e","Type":"ContainerStarted","Data":"169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8"}
Mar 19 09:53:40.996308 master-0 kubenswrapper[15202]: I0319 09:53:40.996300 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9062c234-88ef-4c57-8370-5183a2c5b88e","Type":"ContainerStarted","Data":"8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d"}
Mar 19 09:53:40.996308 master-0 kubenswrapper[15202]: I0319 09:53:40.996311 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9062c234-88ef-4c57-8370-5183a2c5b88e","Type":"ContainerStarted","Data":"b1e59938d89b3000e1458ff6c41b457a918950218ca9bc31dce46c325f18c06e"}
Mar 19 09:53:40.999323 master-0 kubenswrapper[15202]: I0319 09:53:40.999043 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8816b584-8f92-4ac4-93f3-fcd86f5e64a2","Type":"ContainerStarted","Data":"a12fd45be2101eb0c87381eb130a0db01a1f06beae7b9d42e7bad6d11c4f4aa1"}
Mar 19 09:53:40.999323 master-0 kubenswrapper[15202]: I0319 09:53:40.999141 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8816b584-8f92-4ac4-93f3-fcd86f5e64a2","Type":"ContainerStarted","Data":"a8ddd448acdf621fe0c2d12f969a1e9567de3d0367ab77b98676f7f993a3e9da"}
Mar 19 09:53:40.999323 master-0 kubenswrapper[15202]: I0319 09:53:40.999223 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:41.001619 master-0 kubenswrapper[15202]: I0319 09:53:41.000997 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"657520fd-77a9-49cd-a2c0-5b3f9da06c59","Type":"ContainerStarted","Data":"ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5"}
Mar 19 09:53:41.001619 master-0 kubenswrapper[15202]: I0319 09:53:41.001036 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"657520fd-77a9-49cd-a2c0-5b3f9da06c59","Type":"ContainerStarted","Data":"7591e59f39d89f026746cc3ed63c9b9d05d8e8f98f7ca44c6eb3d8233aec84c5"}
Mar 19 09:53:41.041685 master-0 kubenswrapper[15202]: I0319 09:53:41.041558 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.041492299 podStartE2EDuration="2.041492299s" podCreationTimestamp="2026-03-19 09:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:41.025567035 +0000 UTC m=+1738.410981861" watchObservedRunningTime="2026-03-19 09:53:41.041492299 +0000 UTC m=+1738.426907115"
Mar 19 09:53:41.054750 master-0 kubenswrapper[15202]: I0319 09:53:41.054686 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.054615964 podStartE2EDuration="2.054615964s" podCreationTimestamp="2026-03-19 09:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:41.049543409 +0000 UTC m=+1738.434958235" watchObservedRunningTime="2026-03-19 09:53:41.054615964 +0000 UTC m=+1738.440030780"
Mar 19 09:53:41.079933 master-0 kubenswrapper[15202]: I0319 09:53:41.079856 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.079836658 podStartE2EDuration="3.079836658s" podCreationTimestamp="2026-03-19 09:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:41.070648621 +0000 UTC m=+1738.456063457" watchObservedRunningTime="2026-03-19 09:53:41.079836658 +0000 UTC m=+1738.465251474"
Mar 19 09:53:44.454762 master-0 kubenswrapper[15202]: I0319 09:53:44.454672 15202 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 19 09:53:45.506242 master-0 kubenswrapper[15202]: I0319 09:53:45.505344 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 19 09:53:45.506242 master-0 kubenswrapper[15202]: I0319 09:53:45.505444 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 19 09:53:46.520822 master-0 kubenswrapper[15202]: I0319 09:53:46.520741 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.7:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:53:46.521498 master-0 kubenswrapper[15202]: I0319 09:53:46.520749 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.7:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:53:49.393511 master-0 kubenswrapper[15202]: I0319 09:53:49.393235 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 19 09:53:49.454200 master-0 kubenswrapper[15202]: I0319 09:53:49.454114 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 19 09:53:49.537247 master-0 kubenswrapper[15202]: I0319 09:53:49.537191 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 19 09:53:49.771738 master-0 kubenswrapper[15202]: I0319 09:53:49.771645 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 19 09:53:49.771738 master-0 kubenswrapper[15202]: I0319 09:53:49.771749 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 19 09:53:50.156970 master-0 kubenswrapper[15202]: I0319 09:53:50.156831 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 19 09:53:50.853789 master-0 kubenswrapper[15202]: I0319 09:53:50.853714 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:53:50.853789 master-0 kubenswrapper[15202]: I0319 09:53:50.853727 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:53:53.504504 master-0 kubenswrapper[15202]: I0319 09:53:53.504390 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 19 09:53:53.505193 master-0 kubenswrapper[15202]: I0319 09:53:53.504563 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 19 09:53:55.546843 master-0 kubenswrapper[15202]: I0319 09:53:55.546778 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 19 09:53:55.547354 master-0 kubenswrapper[15202]: I0319 09:53:55.546872 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 19 09:53:55.554826 master-0 kubenswrapper[15202]: I0319 09:53:55.554772 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 19 09:53:55.555174 master-0
kubenswrapper[15202]: I0319 09:53:55.555135 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 19 09:53:57.056094 master-0 kubenswrapper[15202]: I0319 09:53:57.055981 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:57.121750 master-0 kubenswrapper[15202]: I0319 09:53:57.121670 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-combined-ca-bundle\") pod \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") "
Mar 19 09:53:57.122050 master-0 kubenswrapper[15202]: I0319 09:53:57.121789 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhgpz\" (UniqueName: \"kubernetes.io/projected/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-kube-api-access-jhgpz\") pod \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") "
Mar 19 09:53:57.122332 master-0 kubenswrapper[15202]: I0319 09:53:57.122241 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-config-data\") pod \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\" (UID: \"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0\") "
Mar 19 09:53:57.125484 master-0 kubenswrapper[15202]: I0319 09:53:57.125399 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-kube-api-access-jhgpz" (OuterVolumeSpecName: "kube-api-access-jhgpz") pod "c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" (UID: "c0c57d6e-0358-487a-8b9b-9a6399cbdfb0"). InnerVolumeSpecName "kube-api-access-jhgpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:53:57.150859 master-0 kubenswrapper[15202]: I0319 09:53:57.150590 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-config-data" (OuterVolumeSpecName: "config-data") pod "c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" (UID: "c0c57d6e-0358-487a-8b9b-9a6399cbdfb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:57.151671 master-0 kubenswrapper[15202]: I0319 09:53:57.151615 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" (UID: "c0c57d6e-0358-487a-8b9b-9a6399cbdfb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:53:57.218087 master-0 kubenswrapper[15202]: I0319 09:53:57.218035 15202 generic.go:334] "Generic (PLEG): container finished" podID="c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" containerID="bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21" exitCode=137
Mar 19 09:53:57.218195 master-0 kubenswrapper[15202]: I0319 09:53:57.218097 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:53:57.218195 master-0 kubenswrapper[15202]: I0319 09:53:57.218128 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0","Type":"ContainerDied","Data":"bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21"}
Mar 19 09:53:57.218264 master-0 kubenswrapper[15202]: I0319 09:53:57.218213 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c0c57d6e-0358-487a-8b9b-9a6399cbdfb0","Type":"ContainerDied","Data":"2a476d0642f428b76d69459d9717142cafe7dfddfb486340d2baf04bf1d2d260"}
Mar 19 09:53:57.218264 master-0 kubenswrapper[15202]: I0319 09:53:57.218234 15202 scope.go:117] "RemoveContainer" containerID="bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21"
Mar 19 09:53:57.225682 master-0 kubenswrapper[15202]: I0319 09:53:57.225616 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:57.225682 master-0 kubenswrapper[15202]: I0319 09:53:57.225679 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:57.225824 master-0 kubenswrapper[15202]: I0319 09:53:57.225692 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhgpz\" (UniqueName: \"kubernetes.io/projected/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0-kube-api-access-jhgpz\") on node \"master-0\" DevicePath \"\""
Mar 19 09:53:57.238418 master-0 kubenswrapper[15202]: I0319 09:53:57.238375 15202 scope.go:117] "RemoveContainer" containerID="bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21"
Mar 19 09:53:57.238889 master-0 kubenswrapper[15202]: E0319 09:53:57.238855 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21\": container with ID starting with bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21 not found: ID does not exist" containerID="bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21"
Mar 19 09:53:57.238948 master-0 kubenswrapper[15202]: I0319 09:53:57.238893 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21"} err="failed to get container status \"bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21\": rpc error: code = NotFound desc = could not find container \"bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21\": container with ID starting with bf0e7da38a763fb3d5b97b73117fc7519564aa5858e3b44c3a7500e7f9ce9d21 not found: ID does not exist"
Mar 19 09:53:57.771641 master-0 kubenswrapper[15202]: I0319 09:53:57.771525 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 19 09:53:57.771641 master-0 kubenswrapper[15202]: I0319 09:53:57.771622 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 19 09:53:59.775641 master-0 kubenswrapper[15202]: I0319 09:53:59.775594 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 19 09:53:59.776590 master-0 kubenswrapper[15202]: I0319 09:53:59.776545 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 19 09:53:59.778815 master-0 kubenswrapper[15202]: I0319 09:53:59.778786 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 19 09:54:00.258745 master-0 kubenswrapper[15202]: I0319
09:54:00.258691 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 19 09:54:03.319715 master-0 kubenswrapper[15202]: I0319 09:54:03.319654 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 09:54:03.336738 master-0 kubenswrapper[15202]: I0319 09:54:03.336678 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 09:54:04.070830 master-0 kubenswrapper[15202]: I0319 09:54:04.070759 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 09:54:04.071461 master-0 kubenswrapper[15202]: E0319 09:54:04.071427 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" containerName="nova-cell1-novncproxy-novncproxy"
Mar 19 09:54:04.071568 master-0 kubenswrapper[15202]: I0319 09:54:04.071480 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" containerName="nova-cell1-novncproxy-novncproxy"
Mar 19 09:54:04.071827 master-0 kubenswrapper[15202]: I0319 09:54:04.071794 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" containerName="nova-cell1-novncproxy-novncproxy"
Mar 19 09:54:04.072818 master-0 kubenswrapper[15202]: I0319 09:54:04.072787 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.080392 master-0 kubenswrapper[15202]: I0319 09:54:04.080345 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 19 09:54:04.080960 master-0 kubenswrapper[15202]: I0319 09:54:04.080894 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 19 09:54:04.081065 master-0 kubenswrapper[15202]: I0319 09:54:04.080943 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 19 09:54:04.159776 master-0 kubenswrapper[15202]: I0319 09:54:04.159594 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 09:54:04.204842 master-0 kubenswrapper[15202]: I0319 09:54:04.204790 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.205229 master-0 kubenswrapper[15202]: I0319 09:54:04.205157 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.205429 master-0 kubenswrapper[15202]: I0319 09:54:04.205413 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.205753 master-0 kubenswrapper[15202]: I0319 09:54:04.205737 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.205950 master-0 kubenswrapper[15202]: I0319 09:54:04.205935 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzkr2\" (UniqueName: \"kubernetes.io/projected/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-kube-api-access-jzkr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.309169 master-0 kubenswrapper[15202]: I0319 09:54:04.308793 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.309169 master-0 kubenswrapper[15202]: I0319 09:54:04.308873 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.309169 master-0 kubenswrapper[15202]: I0319 09:54:04.308922 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzkr2\" (UniqueName:
\"kubernetes.io/projected/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-kube-api-access-jzkr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.309169 master-0 kubenswrapper[15202]: I0319 09:54:04.309054 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.309169 master-0 kubenswrapper[15202]: I0319 09:54:04.309164 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.315456 master-0 kubenswrapper[15202]: I0319 09:54:04.315403 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.321003 master-0 kubenswrapper[15202]: I0319 09:54:04.320499 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.323971 master-0 kubenswrapper[15202]: I0319 09:54:04.323870 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.327166 master-0 kubenswrapper[15202]: I0319 09:54:04.327145 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.374016 master-0 kubenswrapper[15202]: I0319 09:54:04.370112 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzkr2\" (UniqueName: \"kubernetes.io/projected/2ca1bcf5-5b69-4fac-91c0-af03f6f99980-kube-api-access-jzkr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ca1bcf5-5b69-4fac-91c0-af03f6f99980\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.422071 master-0 kubenswrapper[15202]: I0319 09:54:04.413612 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5687765f45-jhnth"]
Mar 19 09:54:04.442606 master-0 kubenswrapper[15202]: I0319 09:54:04.440350 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.447964 master-0 kubenswrapper[15202]: I0319 09:54:04.444846 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 09:54:04.502527 master-0 kubenswrapper[15202]: I0319 09:54:04.502482 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5687765f45-jhnth"]
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.515515 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-config\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.515950 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-edpm-a\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.515987 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-dns-swift-storage-0\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.516037 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-ovsdbserver-sb\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.516181 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5lw\" (UniqueName: \"kubernetes.io/projected/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-kube-api-access-cm5lw\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.516308 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-ovsdbserver-nb\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.516382 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-edpm-b\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.519947 master-0 kubenswrapper[15202]: I0319 09:54:04.516427 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-dns-svc\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.618366 master-0 kubenswrapper[15202]: I0319 09:54:04.618196 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-config\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth"
Mar 19 09:54:04.618366
master-0 kubenswrapper[15202]: I0319 09:54:04.618268 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-edpm-a\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.618366 master-0 kubenswrapper[15202]: I0319 09:54:04.618295 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-dns-swift-storage-0\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.618366 master-0 kubenswrapper[15202]: I0319 09:54:04.618329 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-ovsdbserver-sb\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.618765 master-0 kubenswrapper[15202]: I0319 09:54:04.618409 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5lw\" (UniqueName: \"kubernetes.io/projected/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-kube-api-access-cm5lw\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.618765 master-0 kubenswrapper[15202]: I0319 09:54:04.618499 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-ovsdbserver-nb\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" 
Mar 19 09:54:04.618765 master-0 kubenswrapper[15202]: I0319 09:54:04.618547 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-edpm-b\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.618765 master-0 kubenswrapper[15202]: I0319 09:54:04.618566 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-dns-svc\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.620037 master-0 kubenswrapper[15202]: I0319 09:54:04.620010 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-ovsdbserver-sb\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.620913 master-0 kubenswrapper[15202]: I0319 09:54:04.620868 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-config\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.621114 master-0 kubenswrapper[15202]: I0319 09:54:04.621089 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-dns-svc\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.621639 master-0 kubenswrapper[15202]: I0319 
09:54:04.621565 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-edpm-b\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.621869 master-0 kubenswrapper[15202]: I0319 09:54:04.621827 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-dns-swift-storage-0\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.622305 master-0 kubenswrapper[15202]: I0319 09:54:04.622272 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-edpm-a\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.623429 master-0 kubenswrapper[15202]: I0319 09:54:04.623390 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-ovsdbserver-nb\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.647634 master-0 kubenswrapper[15202]: I0319 09:54:04.643460 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5lw\" (UniqueName: \"kubernetes.io/projected/eef5b369-ca6a-4da9-a54b-4d2cf46e4328-kube-api-access-cm5lw\") pod \"dnsmasq-dns-5687765f45-jhnth\" (UID: \"eef5b369-ca6a-4da9-a54b-4d2cf46e4328\") " pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.778508 master-0 kubenswrapper[15202]: I0319 09:54:04.776179 15202 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:04.857901 master-0 kubenswrapper[15202]: I0319 09:54:04.857784 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c57d6e-0358-487a-8b9b-9a6399cbdfb0" path="/var/lib/kubelet/pods/c0c57d6e-0358-487a-8b9b-9a6399cbdfb0/volumes" Mar 19 09:54:05.172908 master-0 kubenswrapper[15202]: W0319 09:54:05.172848 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca1bcf5_5b69_4fac_91c0_af03f6f99980.slice/crio-94d71d0e4f615cded82c2c8e4902e873826e5b9ac642c707b8e6e6d1d303464e WatchSource:0}: Error finding container 94d71d0e4f615cded82c2c8e4902e873826e5b9ac642c707b8e6e6d1d303464e: Status 404 returned error can't find the container with id 94d71d0e4f615cded82c2c8e4902e873826e5b9ac642c707b8e6e6d1d303464e Mar 19 09:54:05.203287 master-0 kubenswrapper[15202]: I0319 09:54:05.198624 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:54:05.362954 master-0 kubenswrapper[15202]: I0319 09:54:05.362857 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ca1bcf5-5b69-4fac-91c0-af03f6f99980","Type":"ContainerStarted","Data":"94d71d0e4f615cded82c2c8e4902e873826e5b9ac642c707b8e6e6d1d303464e"} Mar 19 09:54:05.440617 master-0 kubenswrapper[15202]: I0319 09:54:05.438518 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5687765f45-jhnth"] Mar 19 09:54:06.380232 master-0 kubenswrapper[15202]: I0319 09:54:06.380178 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ca1bcf5-5b69-4fac-91c0-af03f6f99980","Type":"ContainerStarted","Data":"f90b1f58a3431f98850084df53145ef90e7c32077e72d33e01c9fea66e689a18"} Mar 19 09:54:06.388002 master-0 kubenswrapper[15202]: I0319 
09:54:06.387936 15202 generic.go:334] "Generic (PLEG): container finished" podID="eef5b369-ca6a-4da9-a54b-4d2cf46e4328" containerID="80f432bc52ce8ad9bf6a651784a4c0b2da01ca03a91cec32f63218db1c3aa043" exitCode=0 Mar 19 09:54:06.388002 master-0 kubenswrapper[15202]: I0319 09:54:06.387996 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687765f45-jhnth" event={"ID":"eef5b369-ca6a-4da9-a54b-4d2cf46e4328","Type":"ContainerDied","Data":"80f432bc52ce8ad9bf6a651784a4c0b2da01ca03a91cec32f63218db1c3aa043"} Mar 19 09:54:06.388225 master-0 kubenswrapper[15202]: I0319 09:54:06.388047 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687765f45-jhnth" event={"ID":"eef5b369-ca6a-4da9-a54b-4d2cf46e4328","Type":"ContainerStarted","Data":"fb4344da1d3244694ae0f520053910f658910753517093618d8d154fb09533e9"} Mar 19 09:54:06.408228 master-0 kubenswrapper[15202]: I0319 09:54:06.408145 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.408125701 podStartE2EDuration="3.408125701s" podCreationTimestamp="2026-03-19 09:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:06.404028469 +0000 UTC m=+1763.789443305" watchObservedRunningTime="2026-03-19 09:54:06.408125701 +0000 UTC m=+1763.793540517" Mar 19 09:54:07.228949 master-0 kubenswrapper[15202]: I0319 09:54:07.228857 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:07.229256 master-0 kubenswrapper[15202]: I0319 09:54:07.229129 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-log" containerID="cri-o://8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d" gracePeriod=30 Mar 19 09:54:07.229888 master-0 
kubenswrapper[15202]: I0319 09:54:07.229707 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-api" containerID="cri-o://169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8" gracePeriod=30 Mar 19 09:54:07.401782 master-0 kubenswrapper[15202]: I0319 09:54:07.401655 15202 generic.go:334] "Generic (PLEG): container finished" podID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerID="8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d" exitCode=143 Mar 19 09:54:07.401782 master-0 kubenswrapper[15202]: I0319 09:54:07.401768 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9062c234-88ef-4c57-8370-5183a2c5b88e","Type":"ContainerDied","Data":"8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d"} Mar 19 09:54:07.406630 master-0 kubenswrapper[15202]: I0319 09:54:07.406595 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687765f45-jhnth" event={"ID":"eef5b369-ca6a-4da9-a54b-4d2cf46e4328","Type":"ContainerStarted","Data":"3962eafb2f12d2b7ff2734e55207b2dd18934f1e2921d880ce38432493d07490"} Mar 19 09:54:07.406725 master-0 kubenswrapper[15202]: I0319 09:54:07.406646 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:09.446901 master-0 kubenswrapper[15202]: I0319 09:54:09.446812 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:54:10.926605 master-0 kubenswrapper[15202]: I0319 09:54:10.926555 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:54:10.961206 master-0 kubenswrapper[15202]: I0319 09:54:10.961106 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5687765f45-jhnth" podStartSLOduration=6.961081678 podStartE2EDuration="6.961081678s" podCreationTimestamp="2026-03-19 09:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:07.442760565 +0000 UTC m=+1764.828175381" watchObservedRunningTime="2026-03-19 09:54:10.961081678 +0000 UTC m=+1768.346496494" Mar 19 09:54:11.042577 master-0 kubenswrapper[15202]: I0319 09:54:11.042488 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj6hf\" (UniqueName: \"kubernetes.io/projected/9062c234-88ef-4c57-8370-5183a2c5b88e-kube-api-access-dj6hf\") pod \"9062c234-88ef-4c57-8370-5183a2c5b88e\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " Mar 19 09:54:11.042577 master-0 kubenswrapper[15202]: I0319 09:54:11.042580 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-config-data\") pod \"9062c234-88ef-4c57-8370-5183a2c5b88e\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " Mar 19 09:54:11.042812 master-0 kubenswrapper[15202]: I0319 09:54:11.042792 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-combined-ca-bundle\") pod \"9062c234-88ef-4c57-8370-5183a2c5b88e\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " Mar 19 09:54:11.042884 master-0 kubenswrapper[15202]: I0319 09:54:11.042870 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9062c234-88ef-4c57-8370-5183a2c5b88e-logs\") pod \"9062c234-88ef-4c57-8370-5183a2c5b88e\" (UID: \"9062c234-88ef-4c57-8370-5183a2c5b88e\") " Mar 19 09:54:11.046520 master-0 kubenswrapper[15202]: I0319 09:54:11.046461 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9062c234-88ef-4c57-8370-5183a2c5b88e-kube-api-access-dj6hf" (OuterVolumeSpecName: "kube-api-access-dj6hf") pod "9062c234-88ef-4c57-8370-5183a2c5b88e" (UID: "9062c234-88ef-4c57-8370-5183a2c5b88e"). InnerVolumeSpecName "kube-api-access-dj6hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:11.046851 master-0 kubenswrapper[15202]: I0319 09:54:11.046821 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9062c234-88ef-4c57-8370-5183a2c5b88e-logs" (OuterVolumeSpecName: "logs") pod "9062c234-88ef-4c57-8370-5183a2c5b88e" (UID: "9062c234-88ef-4c57-8370-5183a2c5b88e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:54:11.100135 master-0 kubenswrapper[15202]: I0319 09:54:11.100071 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9062c234-88ef-4c57-8370-5183a2c5b88e" (UID: "9062c234-88ef-4c57-8370-5183a2c5b88e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:54:11.126008 master-0 kubenswrapper[15202]: I0319 09:54:11.125939 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-config-data" (OuterVolumeSpecName: "config-data") pod "9062c234-88ef-4c57-8370-5183a2c5b88e" (UID: "9062c234-88ef-4c57-8370-5183a2c5b88e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:54:11.145762 master-0 kubenswrapper[15202]: I0319 09:54:11.145709 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj6hf\" (UniqueName: \"kubernetes.io/projected/9062c234-88ef-4c57-8370-5183a2c5b88e-kube-api-access-dj6hf\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:11.145762 master-0 kubenswrapper[15202]: I0319 09:54:11.145761 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:11.145960 master-0 kubenswrapper[15202]: I0319 09:54:11.145775 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062c234-88ef-4c57-8370-5183a2c5b88e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:11.145960 master-0 kubenswrapper[15202]: I0319 09:54:11.145787 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9062c234-88ef-4c57-8370-5183a2c5b88e-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:11.450230 master-0 kubenswrapper[15202]: I0319 09:54:11.450174 15202 generic.go:334] "Generic (PLEG): container finished" podID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerID="169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8" exitCode=0 Mar 19 09:54:11.450230 master-0 kubenswrapper[15202]: I0319 09:54:11.450227 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9062c234-88ef-4c57-8370-5183a2c5b88e","Type":"ContainerDied","Data":"169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8"} Mar 19 09:54:11.450490 master-0 kubenswrapper[15202]: I0319 09:54:11.450256 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9062c234-88ef-4c57-8370-5183a2c5b88e","Type":"ContainerDied","Data":"b1e59938d89b3000e1458ff6c41b457a918950218ca9bc31dce46c325f18c06e"} Mar 19 09:54:11.450490 master-0 kubenswrapper[15202]: I0319 09:54:11.450274 15202 scope.go:117] "RemoveContainer" containerID="169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8" Mar 19 09:54:11.450490 master-0 kubenswrapper[15202]: I0319 09:54:11.450395 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:54:11.475778 master-0 kubenswrapper[15202]: I0319 09:54:11.475438 15202 scope.go:117] "RemoveContainer" containerID="8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d" Mar 19 09:54:11.498669 master-0 kubenswrapper[15202]: I0319 09:54:11.498604 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:11.522074 master-0 kubenswrapper[15202]: I0319 09:54:11.508311 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:11.522074 master-0 kubenswrapper[15202]: I0319 09:54:11.517144 15202 scope.go:117] "RemoveContainer" containerID="169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8" Mar 19 09:54:11.528214 master-0 kubenswrapper[15202]: E0319 09:54:11.528136 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8\": container with ID starting with 169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8 not found: ID does not exist" containerID="169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8" Mar 19 09:54:11.528319 master-0 kubenswrapper[15202]: I0319 09:54:11.528222 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8"} err="failed to get container status 
\"169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8\": rpc error: code = NotFound desc = could not find container \"169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8\": container with ID starting with 169880e32ae9b37ee1c6f2cad0555c8c570eec97c1a0b319241d8a5708f8dcc8 not found: ID does not exist" Mar 19 09:54:11.528319 master-0 kubenswrapper[15202]: I0319 09:54:11.528260 15202 scope.go:117] "RemoveContainer" containerID="8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d" Mar 19 09:54:11.528791 master-0 kubenswrapper[15202]: E0319 09:54:11.528730 15202 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d\": container with ID starting with 8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d not found: ID does not exist" containerID="8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d" Mar 19 09:54:11.528856 master-0 kubenswrapper[15202]: I0319 09:54:11.528797 15202 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d"} err="failed to get container status \"8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d\": rpc error: code = NotFound desc = could not find container \"8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d\": container with ID starting with 8ca3446e9a00717fc558b9c7800b001b29427e626be5719eb7043a72a7e61a2d not found: ID does not exist" Mar 19 09:54:11.583734 master-0 kubenswrapper[15202]: I0319 09:54:11.583644 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:11.584441 master-0 kubenswrapper[15202]: E0319 09:54:11.584404 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-log" Mar 19 
09:54:11.584441 master-0 kubenswrapper[15202]: I0319 09:54:11.584435 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-log" Mar 19 09:54:11.584597 master-0 kubenswrapper[15202]: E0319 09:54:11.584549 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-api" Mar 19 09:54:11.584597 master-0 kubenswrapper[15202]: I0319 09:54:11.584564 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-api" Mar 19 09:54:11.584884 master-0 kubenswrapper[15202]: I0319 09:54:11.584849 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-api" Mar 19 09:54:11.584929 master-0 kubenswrapper[15202]: I0319 09:54:11.584893 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" containerName="nova-api-log" Mar 19 09:54:11.590874 master-0 kubenswrapper[15202]: I0319 09:54:11.590821 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:54:11.594156 master-0 kubenswrapper[15202]: I0319 09:54:11.593912 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 09:54:11.595338 master-0 kubenswrapper[15202]: I0319 09:54:11.595304 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 09:54:11.595639 master-0 kubenswrapper[15202]: I0319 09:54:11.595609 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 09:54:11.596172 master-0 kubenswrapper[15202]: I0319 09:54:11.596112 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:11.658116 master-0 kubenswrapper[15202]: I0319 09:54:11.658045 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db115b0-fcb3-4915-9acb-e8df2a364c9c-logs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.658365 master-0 kubenswrapper[15202]: I0319 09:54:11.658161 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb58n\" (UniqueName: \"kubernetes.io/projected/3db115b0-fcb3-4915-9acb-e8df2a364c9c-kube-api-access-fb58n\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.658519 master-0 kubenswrapper[15202]: I0319 09:54:11.658439 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.658573 master-0 kubenswrapper[15202]: I0319 09:54:11.658557 15202 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.658831 master-0 kubenswrapper[15202]: I0319 09:54:11.658801 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-public-tls-certs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.658877 master-0 kubenswrapper[15202]: I0319 09:54:11.658832 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-config-data\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.760110 master-0 kubenswrapper[15202]: I0319 09:54:11.759938 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-config-data\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.760110 master-0 kubenswrapper[15202]: I0319 09:54:11.760008 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-public-tls-certs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.760110 master-0 kubenswrapper[15202]: I0319 09:54:11.760066 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3db115b0-fcb3-4915-9acb-e8df2a364c9c-logs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.760456 master-0 kubenswrapper[15202]: I0319 09:54:11.760112 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb58n\" (UniqueName: \"kubernetes.io/projected/3db115b0-fcb3-4915-9acb-e8df2a364c9c-kube-api-access-fb58n\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.760456 master-0 kubenswrapper[15202]: I0319 09:54:11.760241 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.760456 master-0 kubenswrapper[15202]: I0319 09:54:11.760263 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.761538 master-0 kubenswrapper[15202]: I0319 09:54:11.761457 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db115b0-fcb3-4915-9acb-e8df2a364c9c-logs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.763360 master-0 kubenswrapper[15202]: I0319 09:54:11.763334 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-config-data\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 
09:54:11.763883 master-0 kubenswrapper[15202]: I0319 09:54:11.763847 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-public-tls-certs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.764020 master-0 kubenswrapper[15202]: I0319 09:54:11.763994 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.764845 master-0 kubenswrapper[15202]: I0319 09:54:11.764812 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.775941 master-0 kubenswrapper[15202]: I0319 09:54:11.775891 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb58n\" (UniqueName: \"kubernetes.io/projected/3db115b0-fcb3-4915-9acb-e8df2a364c9c-kube-api-access-fb58n\") pod \"nova-api-0\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") " pod="openstack/nova-api-0" Mar 19 09:54:11.915574 master-0 kubenswrapper[15202]: I0319 09:54:11.915501 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:54:12.420025 master-0 kubenswrapper[15202]: I0319 09:54:12.419955 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:12.485560 master-0 kubenswrapper[15202]: I0319 09:54:12.482727 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db115b0-fcb3-4915-9acb-e8df2a364c9c","Type":"ContainerStarted","Data":"56d83a49d1ab2be50b00415318bbf4822fd3f97065363894f9a4697ec34e7d9f"} Mar 19 09:54:12.856495 master-0 kubenswrapper[15202]: I0319 09:54:12.854815 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9062c234-88ef-4c57-8370-5183a2c5b88e" path="/var/lib/kubelet/pods/9062c234-88ef-4c57-8370-5183a2c5b88e/volumes" Mar 19 09:54:13.494845 master-0 kubenswrapper[15202]: I0319 09:54:13.494783 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db115b0-fcb3-4915-9acb-e8df2a364c9c","Type":"ContainerStarted","Data":"254cb233affad11b34ceadbfd0cba2d20169673986e12c5be388d216d95dc2dc"} Mar 19 09:54:13.494845 master-0 kubenswrapper[15202]: I0319 09:54:13.494845 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db115b0-fcb3-4915-9acb-e8df2a364c9c","Type":"ContainerStarted","Data":"9dcd382eb441da3dbe8cf7d364bb30f5b181cb2a6f38f7e43d81e88b81957da8"} Mar 19 09:54:13.531265 master-0 kubenswrapper[15202]: I0319 09:54:13.531174 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.531152155 podStartE2EDuration="2.531152155s" podCreationTimestamp="2026-03-19 09:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:13.519049575 +0000 UTC m=+1770.904464391" watchObservedRunningTime="2026-03-19 09:54:13.531152155 +0000 UTC m=+1770.916566971" Mar 19 09:54:14.447048 
master-0 kubenswrapper[15202]: I0319 09:54:14.446964 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:54:14.476699 master-0 kubenswrapper[15202]: I0319 09:54:14.476599 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:54:14.530503 master-0 kubenswrapper[15202]: I0319 09:54:14.528061 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:54:14.765596 master-0 kubenswrapper[15202]: I0319 09:54:14.764644 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-nzzbt"] Mar 19 09:54:14.769495 master-0 kubenswrapper[15202]: I0319 09:54:14.766372 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:14.774492 master-0 kubenswrapper[15202]: I0319 09:54:14.771283 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 09:54:14.774492 master-0 kubenswrapper[15202]: I0319 09:54:14.772419 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 09:54:14.778498 master-0 kubenswrapper[15202]: I0319 09:54:14.777075 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nzzbt"] Mar 19 09:54:14.778498 master-0 kubenswrapper[15202]: I0319 09:54:14.777659 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5687765f45-jhnth" Mar 19 09:54:14.951500 master-0 kubenswrapper[15202]: I0319 09:54:14.949054 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x8jc\" (UniqueName: \"kubernetes.io/projected/c5132911-1eb7-4712-b4ed-b7a745d2c36b-kube-api-access-5x8jc\") pod 
\"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:14.951500 master-0 kubenswrapper[15202]: I0319 09:54:14.949136 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:14.951500 master-0 kubenswrapper[15202]: I0319 09:54:14.949350 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-scripts\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:14.962622 master-0 kubenswrapper[15202]: I0319 09:54:14.962514 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-config-data\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:14.963353 master-0 kubenswrapper[15202]: I0319 09:54:14.963288 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4cc6f549-55sdk"] Mar 19 09:54:14.963717 master-0 kubenswrapper[15202]: I0319 09:54:14.963665 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerName="dnsmasq-dns" containerID="cri-o://31fb14aabb59297cd0e36e36253e342817d1d743c51dbbec04577697acb1dfb0" gracePeriod=10 Mar 19 09:54:15.065538 master-0 kubenswrapper[15202]: I0319 
09:54:15.065428 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x8jc\" (UniqueName: \"kubernetes.io/projected/c5132911-1eb7-4712-b4ed-b7a745d2c36b-kube-api-access-5x8jc\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.065538 master-0 kubenswrapper[15202]: I0319 09:54:15.065518 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.066160 master-0 kubenswrapper[15202]: I0319 09:54:15.065997 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-scripts\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.066319 master-0 kubenswrapper[15202]: I0319 09:54:15.066293 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-config-data\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.073897 master-0 kubenswrapper[15202]: I0319 09:54:15.073804 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.074774 master-0 
kubenswrapper[15202]: I0319 09:54:15.074715 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-config-data\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.082829 master-0 kubenswrapper[15202]: I0319 09:54:15.082777 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x8jc\" (UniqueName: \"kubernetes.io/projected/c5132911-1eb7-4712-b4ed-b7a745d2c36b-kube-api-access-5x8jc\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.088979 master-0 kubenswrapper[15202]: I0319 09:54:15.087451 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-scripts\") pod \"nova-cell1-cell-mapping-nzzbt\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.117786 master-0 kubenswrapper[15202]: I0319 09:54:15.117181 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:15.528523 master-0 kubenswrapper[15202]: I0319 09:54:15.526703 15202 generic.go:334] "Generic (PLEG): container finished" podID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerID="31fb14aabb59297cd0e36e36253e342817d1d743c51dbbec04577697acb1dfb0" exitCode=0 Mar 19 09:54:15.528523 master-0 kubenswrapper[15202]: I0319 09:54:15.526795 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" event={"ID":"3a4ea6bb-8177-449b-a022-fb62033cd8c9","Type":"ContainerDied","Data":"31fb14aabb59297cd0e36e36253e342817d1d743c51dbbec04577697acb1dfb0"} Mar 19 09:54:15.634375 master-0 kubenswrapper[15202]: I0319 09:54:15.634330 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:54:15.767051 master-0 kubenswrapper[15202]: I0319 09:54:15.766960 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nzzbt"] Mar 19 09:54:15.810116 master-0 kubenswrapper[15202]: I0319 09:54:15.809225 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-nb\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810116 master-0 kubenswrapper[15202]: I0319 09:54:15.809374 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-sb\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810116 master-0 kubenswrapper[15202]: I0319 09:54:15.809797 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2562b\" (UniqueName: 
\"kubernetes.io/projected/3a4ea6bb-8177-449b-a022-fb62033cd8c9-kube-api-access-2562b\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810116 master-0 kubenswrapper[15202]: I0319 09:54:15.809899 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-swift-storage-0\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810728 master-0 kubenswrapper[15202]: I0319 09:54:15.810483 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-a\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810728 master-0 kubenswrapper[15202]: I0319 09:54:15.810525 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-svc\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810728 master-0 kubenswrapper[15202]: I0319 09:54:15.810554 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-config\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.810728 master-0 kubenswrapper[15202]: I0319 09:54:15.810576 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-b\") pod \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\" (UID: \"3a4ea6bb-8177-449b-a022-fb62033cd8c9\") " Mar 19 09:54:15.822287 
master-0 kubenswrapper[15202]: I0319 09:54:15.821977 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a4ea6bb-8177-449b-a022-fb62033cd8c9-kube-api-access-2562b" (OuterVolumeSpecName: "kube-api-access-2562b") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "kube-api-access-2562b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:15.866830 master-0 kubenswrapper[15202]: I0319 09:54:15.866667 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:15.879102 master-0 kubenswrapper[15202]: I0319 09:54:15.879039 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:15.882270 master-0 kubenswrapper[15202]: I0319 09:54:15.882206 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-config" (OuterVolumeSpecName: "config") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:15.885010 master-0 kubenswrapper[15202]: I0319 09:54:15.884931 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:15.890527 master-0 kubenswrapper[15202]: I0319 09:54:15.890460 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:15.901746 master-0 kubenswrapper[15202]: I0319 09:54:15.901679 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:15.914949 master-0 kubenswrapper[15202]: I0319 09:54:15.914889 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:15.914949 master-0 kubenswrapper[15202]: I0319 09:54:15.914937 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2562b\" (UniqueName: \"kubernetes.io/projected/3a4ea6bb-8177-449b-a022-fb62033cd8c9-kube-api-access-2562b\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:15.914949 master-0 kubenswrapper[15202]: I0319 09:54:15.914953 15202 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:15.915431 master-0 kubenswrapper[15202]: I0319 09:54:15.914964 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:15.915431 master-0 kubenswrapper[15202]: I0319 09:54:15.914977 15202 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:15.915431 master-0 kubenswrapper[15202]: I0319 09:54:15.914987 15202 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-config\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:15.915431 master-0 kubenswrapper[15202]: I0319 09:54:15.914997 15202 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-edpm-b\") on node 
\"master-0\" DevicePath \"\"" Mar 19 09:54:15.917719 master-0 kubenswrapper[15202]: I0319 09:54:15.917691 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a4ea6bb-8177-449b-a022-fb62033cd8c9" (UID: "3a4ea6bb-8177-449b-a022-fb62033cd8c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:16.016744 master-0 kubenswrapper[15202]: I0319 09:54:16.016671 15202 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4ea6bb-8177-449b-a022-fb62033cd8c9-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:16.539700 master-0 kubenswrapper[15202]: I0319 09:54:16.539643 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" event={"ID":"3a4ea6bb-8177-449b-a022-fb62033cd8c9","Type":"ContainerDied","Data":"90e2789902082dd41b77a267a99c3537ad92d1ad7cfa99ee0a1e1a4058b055d5"} Mar 19 09:54:16.539700 master-0 kubenswrapper[15202]: I0319 09:54:16.539707 15202 scope.go:117] "RemoveContainer" containerID="31fb14aabb59297cd0e36e36253e342817d1d743c51dbbec04577697acb1dfb0" Mar 19 09:54:16.539950 master-0 kubenswrapper[15202]: I0319 09:54:16.539867 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4cc6f549-55sdk" Mar 19 09:54:16.546701 master-0 kubenswrapper[15202]: I0319 09:54:16.546638 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nzzbt" event={"ID":"c5132911-1eb7-4712-b4ed-b7a745d2c36b","Type":"ContainerStarted","Data":"498416a58915bdf931863577df951e3bcd558b68eaccae8e56c46022003dd2e2"} Mar 19 09:54:16.546792 master-0 kubenswrapper[15202]: I0319 09:54:16.546710 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nzzbt" event={"ID":"c5132911-1eb7-4712-b4ed-b7a745d2c36b","Type":"ContainerStarted","Data":"390719a7702d62c7d4ac149eccae4e7488f1cf10b48e92576cf8359cbead5341"} Mar 19 09:54:16.566865 master-0 kubenswrapper[15202]: I0319 09:54:16.566806 15202 scope.go:117] "RemoveContainer" containerID="8d26556787d3f6581b20b2c6d142273913ed9a54cf40777a695cffab286be379" Mar 19 09:54:16.612527 master-0 kubenswrapper[15202]: I0319 09:54:16.612402 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-nzzbt" podStartSLOduration=2.612377135 podStartE2EDuration="2.612377135s" podCreationTimestamp="2026-03-19 09:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:16.581602843 +0000 UTC m=+1773.967017659" watchObservedRunningTime="2026-03-19 09:54:16.612377135 +0000 UTC m=+1773.997791951" Mar 19 09:54:16.614650 master-0 kubenswrapper[15202]: I0319 09:54:16.614609 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4cc6f549-55sdk"] Mar 19 09:54:16.630027 master-0 kubenswrapper[15202]: I0319 09:54:16.629959 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4cc6f549-55sdk"] Mar 19 09:54:16.829385 master-0 kubenswrapper[15202]: I0319 09:54:16.828385 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" path="/var/lib/kubelet/pods/3a4ea6bb-8177-449b-a022-fb62033cd8c9/volumes" Mar 19 09:54:20.623716 master-0 kubenswrapper[15202]: I0319 09:54:20.623661 15202 generic.go:334] "Generic (PLEG): container finished" podID="c5132911-1eb7-4712-b4ed-b7a745d2c36b" containerID="498416a58915bdf931863577df951e3bcd558b68eaccae8e56c46022003dd2e2" exitCode=0 Mar 19 09:54:20.624266 master-0 kubenswrapper[15202]: I0319 09:54:20.623722 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nzzbt" event={"ID":"c5132911-1eb7-4712-b4ed-b7a745d2c36b","Type":"ContainerDied","Data":"498416a58915bdf931863577df951e3bcd558b68eaccae8e56c46022003dd2e2"} Mar 19 09:54:21.917103 master-0 kubenswrapper[15202]: I0319 09:54:21.915964 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:54:21.917103 master-0 kubenswrapper[15202]: I0319 09:54:21.916031 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:54:22.175313 master-0 kubenswrapper[15202]: I0319 09:54:22.175220 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nzzbt" Mar 19 09:54:22.286927 master-0 kubenswrapper[15202]: I0319 09:54:22.286832 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-config-data\") pod \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " Mar 19 09:54:22.287511 master-0 kubenswrapper[15202]: I0319 09:54:22.286965 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x8jc\" (UniqueName: \"kubernetes.io/projected/c5132911-1eb7-4712-b4ed-b7a745d2c36b-kube-api-access-5x8jc\") pod \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " Mar 19 09:54:22.287511 master-0 kubenswrapper[15202]: I0319 09:54:22.287166 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-scripts\") pod \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " Mar 19 09:54:22.287511 master-0 kubenswrapper[15202]: I0319 09:54:22.287311 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-combined-ca-bundle\") pod \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\" (UID: \"c5132911-1eb7-4712-b4ed-b7a745d2c36b\") " Mar 19 09:54:22.293744 master-0 kubenswrapper[15202]: I0319 09:54:22.293702 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-scripts" (OuterVolumeSpecName: "scripts") pod "c5132911-1eb7-4712-b4ed-b7a745d2c36b" (UID: "c5132911-1eb7-4712-b4ed-b7a745d2c36b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:54:22.293887 master-0 kubenswrapper[15202]: I0319 09:54:22.293797 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5132911-1eb7-4712-b4ed-b7a745d2c36b-kube-api-access-5x8jc" (OuterVolumeSpecName: "kube-api-access-5x8jc") pod "c5132911-1eb7-4712-b4ed-b7a745d2c36b" (UID: "c5132911-1eb7-4712-b4ed-b7a745d2c36b"). InnerVolumeSpecName "kube-api-access-5x8jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:22.314429 master-0 kubenswrapper[15202]: I0319 09:54:22.314354 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5132911-1eb7-4712-b4ed-b7a745d2c36b" (UID: "c5132911-1eb7-4712-b4ed-b7a745d2c36b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:54:22.330279 master-0 kubenswrapper[15202]: I0319 09:54:22.330204 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-config-data" (OuterVolumeSpecName: "config-data") pod "c5132911-1eb7-4712-b4ed-b7a745d2c36b" (UID: "c5132911-1eb7-4712-b4ed-b7a745d2c36b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:54:22.390785 master-0 kubenswrapper[15202]: I0319 09:54:22.390642 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x8jc\" (UniqueName: \"kubernetes.io/projected/c5132911-1eb7-4712-b4ed-b7a745d2c36b-kube-api-access-5x8jc\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:22.390785 master-0 kubenswrapper[15202]: I0319 09:54:22.390692 15202 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:22.390785 master-0 kubenswrapper[15202]: I0319 09:54:22.390736 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:22.390785 master-0 kubenswrapper[15202]: I0319 09:54:22.390752 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5132911-1eb7-4712-b4ed-b7a745d2c36b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 09:54:22.659306 master-0 kubenswrapper[15202]: I0319 09:54:22.659253 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nzzbt" event={"ID":"c5132911-1eb7-4712-b4ed-b7a745d2c36b","Type":"ContainerDied","Data":"390719a7702d62c7d4ac149eccae4e7488f1cf10b48e92576cf8359cbead5341"} Mar 19 09:54:22.659306 master-0 kubenswrapper[15202]: I0319 09:54:22.659301 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390719a7702d62c7d4ac149eccae4e7488f1cf10b48e92576cf8359cbead5341" Mar 19 09:54:22.659657 master-0 kubenswrapper[15202]: I0319 09:54:22.659327 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nzzbt"
Mar 19 09:54:22.964925 master-0 kubenswrapper[15202]: I0319 09:54:22.964859 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.13:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:54:22.965588 master-0 kubenswrapper[15202]: I0319 09:54:22.965271 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.13:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:54:23.071073 master-0 kubenswrapper[15202]: I0319 09:54:23.071020 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:54:23.071882 master-0 kubenswrapper[15202]: I0319 09:54:23.071842 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-log" containerID="cri-o://9dcd382eb441da3dbe8cf7d364bb30f5b181cb2a6f38f7e43d81e88b81957da8" gracePeriod=30
Mar 19 09:54:23.072238 master-0 kubenswrapper[15202]: I0319 09:54:23.072086 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-api" containerID="cri-o://254cb233affad11b34ceadbfd0cba2d20169673986e12c5be388d216d95dc2dc" gracePeriod=30
Mar 19 09:54:23.097413 master-0 kubenswrapper[15202]: I0319 09:54:23.097355 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 09:54:23.097697 master-0 kubenswrapper[15202]: I0319 09:54:23.097648 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" containerName="nova-scheduler-scheduler" containerID="cri-o://ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5" gracePeriod=30
Mar 19 09:54:23.110076 master-0 kubenswrapper[15202]: I0319 09:54:23.110012 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:54:23.110325 master-0 kubenswrapper[15202]: I0319 09:54:23.110285 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-log" containerID="cri-o://295e7e8891c3942e3f85dc66aae2ae55fdb84078ee0f214962c30c10d15dc2b9" gracePeriod=30
Mar 19 09:54:23.110424 master-0 kubenswrapper[15202]: I0319 09:54:23.110362 15202 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-metadata" containerID="cri-o://3ffe27364d3e76eed5acd2b2cde951376df510d6b107f537aeb7ed98e95e32a7" gracePeriod=30
Mar 19 09:54:23.672920 master-0 kubenswrapper[15202]: I0319 09:54:23.672751 15202 generic.go:334] "Generic (PLEG): container finished" podID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerID="9dcd382eb441da3dbe8cf7d364bb30f5b181cb2a6f38f7e43d81e88b81957da8" exitCode=143
Mar 19 09:54:23.672920 master-0 kubenswrapper[15202]: I0319 09:54:23.672841 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db115b0-fcb3-4915-9acb-e8df2a364c9c","Type":"ContainerDied","Data":"9dcd382eb441da3dbe8cf7d364bb30f5b181cb2a6f38f7e43d81e88b81957da8"}
Mar 19 09:54:23.675325 master-0 kubenswrapper[15202]: I0319 09:54:23.675294 15202 generic.go:334] "Generic (PLEG): container finished" podID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerID="295e7e8891c3942e3f85dc66aae2ae55fdb84078ee0f214962c30c10d15dc2b9" exitCode=143
Mar 19 09:54:23.675406 master-0 kubenswrapper[15202]: I0319 09:54:23.675340 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0bb69fd-8511-4feb-949e-3ca2388274dc","Type":"ContainerDied","Data":"295e7e8891c3942e3f85dc66aae2ae55fdb84078ee0f214962c30c10d15dc2b9"}
Mar 19 09:54:24.460786 master-0 kubenswrapper[15202]: E0319 09:54:24.460707 15202 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 19 09:54:24.462728 master-0 kubenswrapper[15202]: E0319 09:54:24.462572 15202 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 19 09:54:24.465167 master-0 kubenswrapper[15202]: E0319 09:54:24.464088 15202 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 19 09:54:24.465167 master-0 kubenswrapper[15202]: E0319 09:54:24.464169 15202 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" containerName="nova-scheduler-scheduler"
Mar 19 09:54:26.723441 master-0 kubenswrapper[15202]: I0319 09:54:26.723389 15202 generic.go:334] "Generic (PLEG): container finished" podID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerID="3ffe27364d3e76eed5acd2b2cde951376df510d6b107f537aeb7ed98e95e32a7" exitCode=0
Mar 19 09:54:26.723975 master-0 kubenswrapper[15202]: I0319 09:54:26.723445 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0bb69fd-8511-4feb-949e-3ca2388274dc","Type":"ContainerDied","Data":"3ffe27364d3e76eed5acd2b2cde951376df510d6b107f537aeb7ed98e95e32a7"}
Mar 19 09:54:26.883672 master-0 kubenswrapper[15202]: I0319 09:54:26.883600 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:54:27.048635 master-0 kubenswrapper[15202]: I0319 09:54:27.028894 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0bb69fd-8511-4feb-949e-3ca2388274dc-logs\") pod \"d0bb69fd-8511-4feb-949e-3ca2388274dc\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") "
Mar 19 09:54:27.048635 master-0 kubenswrapper[15202]: I0319 09:54:27.029011 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-combined-ca-bundle\") pod \"d0bb69fd-8511-4feb-949e-3ca2388274dc\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") "
Mar 19 09:54:27.048635 master-0 kubenswrapper[15202]: I0319 09:54:27.029150 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-nova-metadata-tls-certs\") pod \"d0bb69fd-8511-4feb-949e-3ca2388274dc\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") "
Mar 19 09:54:27.048635 master-0 kubenswrapper[15202]: I0319 09:54:27.029273 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-config-data\") pod \"d0bb69fd-8511-4feb-949e-3ca2388274dc\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") "
Mar 19 09:54:27.048635 master-0 kubenswrapper[15202]: I0319 09:54:27.029347 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf2g7\" (UniqueName: \"kubernetes.io/projected/d0bb69fd-8511-4feb-949e-3ca2388274dc-kube-api-access-tf2g7\") pod \"d0bb69fd-8511-4feb-949e-3ca2388274dc\" (UID: \"d0bb69fd-8511-4feb-949e-3ca2388274dc\") "
Mar 19 09:54:27.048635 master-0 kubenswrapper[15202]: I0319 09:54:27.043718 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bb69fd-8511-4feb-949e-3ca2388274dc-kube-api-access-tf2g7" (OuterVolumeSpecName: "kube-api-access-tf2g7") pod "d0bb69fd-8511-4feb-949e-3ca2388274dc" (UID: "d0bb69fd-8511-4feb-949e-3ca2388274dc"). InnerVolumeSpecName "kube-api-access-tf2g7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:54:27.061497 master-0 kubenswrapper[15202]: I0319 09:54:27.059817 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0bb69fd-8511-4feb-949e-3ca2388274dc-logs" (OuterVolumeSpecName: "logs") pod "d0bb69fd-8511-4feb-949e-3ca2388274dc" (UID: "d0bb69fd-8511-4feb-949e-3ca2388274dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:54:27.152508 master-0 kubenswrapper[15202]: I0319 09:54:27.146328 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf2g7\" (UniqueName: \"kubernetes.io/projected/d0bb69fd-8511-4feb-949e-3ca2388274dc-kube-api-access-tf2g7\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:27.152508 master-0 kubenswrapper[15202]: I0319 09:54:27.146396 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0bb69fd-8511-4feb-949e-3ca2388274dc-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:27.152508 master-0 kubenswrapper[15202]: I0319 09:54:27.146621 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0bb69fd-8511-4feb-949e-3ca2388274dc" (UID: "d0bb69fd-8511-4feb-949e-3ca2388274dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:27.206600 master-0 kubenswrapper[15202]: I0319 09:54:27.205515 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d0bb69fd-8511-4feb-949e-3ca2388274dc" (UID: "d0bb69fd-8511-4feb-949e-3ca2388274dc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:27.223675 master-0 kubenswrapper[15202]: I0319 09:54:27.223624 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-config-data" (OuterVolumeSpecName: "config-data") pod "d0bb69fd-8511-4feb-949e-3ca2388274dc" (UID: "d0bb69fd-8511-4feb-949e-3ca2388274dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:27.248149 master-0 kubenswrapper[15202]: I0319 09:54:27.248093 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:27.248149 master-0 kubenswrapper[15202]: I0319 09:54:27.248141 15202 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:27.248149 master-0 kubenswrapper[15202]: I0319 09:54:27.248154 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bb69fd-8511-4feb-949e-3ca2388274dc-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:27.744710 master-0 kubenswrapper[15202]: I0319 09:54:27.744589 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d0bb69fd-8511-4feb-949e-3ca2388274dc","Type":"ContainerDied","Data":"b14327301e72161e1755bb74569f26906a5a4ddddd14697258bdb26b5025baad"}
Mar 19 09:54:27.745306 master-0 kubenswrapper[15202]: I0319 09:54:27.744711 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:54:27.745306 master-0 kubenswrapper[15202]: I0319 09:54:27.744771 15202 scope.go:117] "RemoveContainer" containerID="3ffe27364d3e76eed5acd2b2cde951376df510d6b107f537aeb7ed98e95e32a7"
Mar 19 09:54:27.771400 master-0 kubenswrapper[15202]: I0319 09:54:27.771321 15202 scope.go:117] "RemoveContainer" containerID="295e7e8891c3942e3f85dc66aae2ae55fdb84078ee0f214962c30c10d15dc2b9"
Mar 19 09:54:27.801647 master-0 kubenswrapper[15202]: I0319 09:54:27.801555 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:54:27.819172 master-0 kubenswrapper[15202]: I0319 09:54:27.819110 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.853653 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: E0319 09:54:27.854445 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-log"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.854510 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-log"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: E0319 09:54:27.854550 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerName="init"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.854560 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerName="init"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: E0319 09:54:27.854612 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerName="dnsmasq-dns"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.854624 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerName="dnsmasq-dns"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: E0319 09:54:27.854638 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5132911-1eb7-4712-b4ed-b7a745d2c36b" containerName="nova-manage"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.854645 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5132911-1eb7-4712-b4ed-b7a745d2c36b" containerName="nova-manage"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: E0319 09:54:27.854655 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-metadata"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.854664 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-metadata"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.855022 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-log"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.855049 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4ea6bb-8177-449b-a022-fb62033cd8c9" containerName="dnsmasq-dns"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.855068 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5132911-1eb7-4712-b4ed-b7a745d2c36b" containerName="nova-manage"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.855084 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" containerName="nova-metadata-metadata"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.856790 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:54:27.878729 master-0 kubenswrapper[15202]: I0319 09:54:27.875756 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:54:27.879853 master-0 kubenswrapper[15202]: I0319 09:54:27.879771 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 19 09:54:27.882795 master-0 kubenswrapper[15202]: I0319 09:54:27.880577 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 19 09:54:27.882795 master-0 kubenswrapper[15202]: I0319 09:54:27.882172 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h56fm\" (UniqueName: \"kubernetes.io/projected/17791259-f6c3-44a3-9ee1-f87b6c7db780-kube-api-access-h56fm\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.882795 master-0 kubenswrapper[15202]: I0319 09:54:27.882283 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-config-data\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.882795 master-0 kubenswrapper[15202]: I0319 09:54:27.882725 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.882975 master-0 kubenswrapper[15202]: I0319 09:54:27.882879 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17791259-f6c3-44a3-9ee1-f87b6c7db780-logs\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.882975 master-0 kubenswrapper[15202]: I0319 09:54:27.882950 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.989709 master-0 kubenswrapper[15202]: I0319 09:54:27.985629 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h56fm\" (UniqueName: \"kubernetes.io/projected/17791259-f6c3-44a3-9ee1-f87b6c7db780-kube-api-access-h56fm\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.989709 master-0 kubenswrapper[15202]: I0319 09:54:27.985705 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-config-data\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.989709 master-0 kubenswrapper[15202]: I0319 09:54:27.986044 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.989709 master-0 kubenswrapper[15202]: I0319 09:54:27.986459 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17791259-f6c3-44a3-9ee1-f87b6c7db780-logs\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.989709 master-0 kubenswrapper[15202]: I0319 09:54:27.986648 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.989709 master-0 kubenswrapper[15202]: I0319 09:54:27.987085 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17791259-f6c3-44a3-9ee1-f87b6c7db780-logs\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.990259 master-0 kubenswrapper[15202]: I0319 09:54:27.990083 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.990354 master-0 kubenswrapper[15202]: I0319 09:54:27.990295 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:27.998566 master-0 kubenswrapper[15202]: I0319 09:54:27.996017 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17791259-f6c3-44a3-9ee1-f87b6c7db780-config-data\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:28.007489 master-0 kubenswrapper[15202]: I0319 09:54:28.007127 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h56fm\" (UniqueName: \"kubernetes.io/projected/17791259-f6c3-44a3-9ee1-f87b6c7db780-kube-api-access-h56fm\") pod \"nova-metadata-0\" (UID: \"17791259-f6c3-44a3-9ee1-f87b6c7db780\") " pod="openstack/nova-metadata-0"
Mar 19 09:54:28.221198 master-0 kubenswrapper[15202]: I0319 09:54:28.221093 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 09:54:28.733581 master-0 kubenswrapper[15202]: I0319 09:54:28.733168 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 09:54:28.773714 master-0 kubenswrapper[15202]: I0319 09:54:28.773582 15202 generic.go:334] "Generic (PLEG): container finished" podID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerID="254cb233affad11b34ceadbfd0cba2d20169673986e12c5be388d216d95dc2dc" exitCode=0
Mar 19 09:54:28.773714 master-0 kubenswrapper[15202]: I0319 09:54:28.773656 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db115b0-fcb3-4915-9acb-e8df2a364c9c","Type":"ContainerDied","Data":"254cb233affad11b34ceadbfd0cba2d20169673986e12c5be388d216d95dc2dc"}
Mar 19 09:54:28.775024 master-0 kubenswrapper[15202]: I0319 09:54:28.774977 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17791259-f6c3-44a3-9ee1-f87b6c7db780","Type":"ContainerStarted","Data":"37537357259c359b267842db3c1485cc4e4a2012b60b575a1af28ffbb07ef8b8"}
Mar 19 09:54:28.778303 master-0 kubenswrapper[15202]: I0319 09:54:28.778248 15202 generic.go:334] "Generic (PLEG): container finished" podID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" containerID="ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5" exitCode=0
Mar 19 09:54:28.778303 master-0 kubenswrapper[15202]: I0319 09:54:28.778275 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"657520fd-77a9-49cd-a2c0-5b3f9da06c59","Type":"ContainerDied","Data":"ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5"}
Mar 19 09:54:28.832023 master-0 kubenswrapper[15202]: I0319 09:54:28.831927 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bb69fd-8511-4feb-949e-3ca2388274dc" path="/var/lib/kubelet/pods/d0bb69fd-8511-4feb-949e-3ca2388274dc/volumes"
Mar 19 09:54:28.968851 master-0 kubenswrapper[15202]: I0319 09:54:28.968808 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 19 09:54:29.030161 master-0 kubenswrapper[15202]: I0319 09:54:29.030063 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-combined-ca-bundle\") pod \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") "
Mar 19 09:54:29.030588 master-0 kubenswrapper[15202]: I0319 09:54:29.030570 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rstf\" (UniqueName: \"kubernetes.io/projected/657520fd-77a9-49cd-a2c0-5b3f9da06c59-kube-api-access-9rstf\") pod \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") "
Mar 19 09:54:29.030949 master-0 kubenswrapper[15202]: I0319 09:54:29.030927 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-config-data\") pod \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\" (UID: \"657520fd-77a9-49cd-a2c0-5b3f9da06c59\") "
Mar 19 09:54:29.043130 master-0 kubenswrapper[15202]: I0319 09:54:29.043034 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657520fd-77a9-49cd-a2c0-5b3f9da06c59-kube-api-access-9rstf" (OuterVolumeSpecName: "kube-api-access-9rstf") pod "657520fd-77a9-49cd-a2c0-5b3f9da06c59" (UID: "657520fd-77a9-49cd-a2c0-5b3f9da06c59"). InnerVolumeSpecName "kube-api-access-9rstf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:54:29.053670 master-0 kubenswrapper[15202]: I0319 09:54:29.053260 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rstf\" (UniqueName: \"kubernetes.io/projected/657520fd-77a9-49cd-a2c0-5b3f9da06c59-kube-api-access-9rstf\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.073605 master-0 kubenswrapper[15202]: I0319 09:54:29.073398 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "657520fd-77a9-49cd-a2c0-5b3f9da06c59" (UID: "657520fd-77a9-49cd-a2c0-5b3f9da06c59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:29.081858 master-0 kubenswrapper[15202]: I0319 09:54:29.081805 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-config-data" (OuterVolumeSpecName: "config-data") pod "657520fd-77a9-49cd-a2c0-5b3f9da06c59" (UID: "657520fd-77a9-49cd-a2c0-5b3f9da06c59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:29.117288 master-0 kubenswrapper[15202]: I0319 09:54:29.117234 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 19 09:54:29.154276 master-0 kubenswrapper[15202]: I0319 09:54:29.154184 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-public-tls-certs\") pod \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") "
Mar 19 09:54:29.154276 master-0 kubenswrapper[15202]: I0319 09:54:29.154252 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-combined-ca-bundle\") pod \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") "
Mar 19 09:54:29.154584 master-0 kubenswrapper[15202]: I0319 09:54:29.154298 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-internal-tls-certs\") pod \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") "
Mar 19 09:54:29.154584 master-0 kubenswrapper[15202]: I0319 09:54:29.154349 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db115b0-fcb3-4915-9acb-e8df2a364c9c-logs\") pod \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") "
Mar 19 09:54:29.154584 master-0 kubenswrapper[15202]: I0319 09:54:29.154377 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb58n\" (UniqueName: \"kubernetes.io/projected/3db115b0-fcb3-4915-9acb-e8df2a364c9c-kube-api-access-fb58n\") pod \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") "
Mar 19 09:54:29.154584 master-0 kubenswrapper[15202]: I0319 09:54:29.154511 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-config-data\") pod \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\" (UID: \"3db115b0-fcb3-4915-9acb-e8df2a364c9c\") "
Mar 19 09:54:29.155033 master-0 kubenswrapper[15202]: I0319 09:54:29.155002 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.155033 master-0 kubenswrapper[15202]: I0319 09:54:29.155022 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657520fd-77a9-49cd-a2c0-5b3f9da06c59-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.160715 master-0 kubenswrapper[15202]: I0319 09:54:29.160607 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db115b0-fcb3-4915-9acb-e8df2a364c9c-logs" (OuterVolumeSpecName: "logs") pod "3db115b0-fcb3-4915-9acb-e8df2a364c9c" (UID: "3db115b0-fcb3-4915-9acb-e8df2a364c9c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:54:29.184237 master-0 kubenswrapper[15202]: I0319 09:54:29.184168 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db115b0-fcb3-4915-9acb-e8df2a364c9c-kube-api-access-fb58n" (OuterVolumeSpecName: "kube-api-access-fb58n") pod "3db115b0-fcb3-4915-9acb-e8df2a364c9c" (UID: "3db115b0-fcb3-4915-9acb-e8df2a364c9c"). InnerVolumeSpecName "kube-api-access-fb58n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:54:29.192800 master-0 kubenswrapper[15202]: I0319 09:54:29.192728 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db115b0-fcb3-4915-9acb-e8df2a364c9c" (UID: "3db115b0-fcb3-4915-9acb-e8df2a364c9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:29.220750 master-0 kubenswrapper[15202]: I0319 09:54:29.220665 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-config-data" (OuterVolumeSpecName: "config-data") pod "3db115b0-fcb3-4915-9acb-e8df2a364c9c" (UID: "3db115b0-fcb3-4915-9acb-e8df2a364c9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:29.236435 master-0 kubenswrapper[15202]: I0319 09:54:29.236367 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3db115b0-fcb3-4915-9acb-e8df2a364c9c" (UID: "3db115b0-fcb3-4915-9acb-e8df2a364c9c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:29.252525 master-0 kubenswrapper[15202]: I0319 09:54:29.250786 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3db115b0-fcb3-4915-9acb-e8df2a364c9c" (UID: "3db115b0-fcb3-4915-9acb-e8df2a364c9c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:54:29.258428 master-0 kubenswrapper[15202]: I0319 09:54:29.258365 15202 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.258428 master-0 kubenswrapper[15202]: I0319 09:54:29.258427 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.258428 master-0 kubenswrapper[15202]: I0319 09:54:29.258444 15202 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.258922 master-0 kubenswrapper[15202]: I0319 09:54:29.258453 15202 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db115b0-fcb3-4915-9acb-e8df2a364c9c-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.258922 master-0 kubenswrapper[15202]: I0319 09:54:29.258540 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb58n\" (UniqueName: \"kubernetes.io/projected/3db115b0-fcb3-4915-9acb-e8df2a364c9c-kube-api-access-fb58n\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.258922 master-0 kubenswrapper[15202]: I0319 09:54:29.258553 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db115b0-fcb3-4915-9acb-e8df2a364c9c-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 09:54:29.790595 master-0 kubenswrapper[15202]: I0319 09:54:29.790532 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3db115b0-fcb3-4915-9acb-e8df2a364c9c","Type":"ContainerDied","Data":"56d83a49d1ab2be50b00415318bbf4822fd3f97065363894f9a4697ec34e7d9f"}
Mar 19 09:54:29.791102 master-0 kubenswrapper[15202]: I0319 09:54:29.790601 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 19 09:54:29.791102 master-0 kubenswrapper[15202]: I0319 09:54:29.790608 15202 scope.go:117] "RemoveContainer" containerID="254cb233affad11b34ceadbfd0cba2d20169673986e12c5be388d216d95dc2dc"
Mar 19 09:54:29.794562 master-0 kubenswrapper[15202]: I0319 09:54:29.794512 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17791259-f6c3-44a3-9ee1-f87b6c7db780","Type":"ContainerStarted","Data":"a12a27d7de027b84599909651eae917e1bf2f7b4e1a4ffb7d0f210a2bc1055a0"}
Mar 19 09:54:29.794673 master-0 kubenswrapper[15202]: I0319 09:54:29.794598 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"17791259-f6c3-44a3-9ee1-f87b6c7db780","Type":"ContainerStarted","Data":"663e2114d39af158611d3f4cc14b1447bf652237669f5c17c99039d47e73933b"}
Mar 19 09:54:29.796664 master-0 kubenswrapper[15202]: I0319 09:54:29.796632 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"657520fd-77a9-49cd-a2c0-5b3f9da06c59","Type":"ContainerDied","Data":"7591e59f39d89f026746cc3ed63c9b9d05d8e8f98f7ca44c6eb3d8233aec84c5"}
Mar 19 09:54:29.796740 master-0 kubenswrapper[15202]: I0319 09:54:29.796671 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 19 09:54:29.820081 master-0 kubenswrapper[15202]: I0319 09:54:29.819452 15202 scope.go:117] "RemoveContainer" containerID="9dcd382eb441da3dbe8cf7d364bb30f5b181cb2a6f38f7e43d81e88b81957da8"
Mar 19 09:54:29.854769 master-0 kubenswrapper[15202]: I0319 09:54:29.854690 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.854666742 podStartE2EDuration="2.854666742s" podCreationTimestamp="2026-03-19 09:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:29.843911095 +0000 UTC m=+1787.229325911" watchObservedRunningTime="2026-03-19 09:54:29.854666742 +0000 UTC m=+1787.240081578"
Mar 19 09:54:29.863891 master-0 kubenswrapper[15202]: I0319 09:54:29.863851 15202 scope.go:117] "RemoveContainer" containerID="ebd8507e54f4257464b1f066db3d70a78f36d2b95de956d7d9941a1a502b56f5"
Mar 19 09:54:29.881340 master-0 kubenswrapper[15202]: I0319 09:54:29.881304 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:54:29.921580 master-0 kubenswrapper[15202]: I0319 09:54:29.920537 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:54:29.948395 master-0 kubenswrapper[15202]: I0319 09:54:29.948156 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 09:54:29.960240 master-0 kubenswrapper[15202]: I0319 09:54:29.960169 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 09:54:29.971774 master-0 kubenswrapper[15202]: I0319 09:54:29.971664 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 19 09:54:29.972798 master-0 kubenswrapper[15202]: E0319 09:54:29.972582 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-api"
Mar 19 09:54:29.972798 master-0 kubenswrapper[15202]: I0319 09:54:29.972628 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-api"
Mar 19 09:54:29.972798 master-0 kubenswrapper[15202]: E0319 09:54:29.972650 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-log"
Mar 19 09:54:29.972798 master-0 kubenswrapper[15202]: I0319 09:54:29.972659 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-log"
Mar 19 09:54:29.972798 master-0 kubenswrapper[15202]: E0319 09:54:29.972737 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" containerName="nova-scheduler-scheduler"
Mar 19 09:54:29.972798 master-0 kubenswrapper[15202]: I0319 09:54:29.972752 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" containerName="nova-scheduler-scheduler"
Mar 19 09:54:29.973269 master-0 kubenswrapper[15202]: I0319 09:54:29.973246 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-log"
Mar 19 09:54:29.973334 master-0 kubenswrapper[15202]: I0319 09:54:29.973272 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" containerName="nova-scheduler-scheduler"
Mar 19 09:54:29.973334 master-0 kubenswrapper[15202]: I0319 09:54:29.973312 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" containerName="nova-api-api"
Mar 19 09:54:29.993555 master-0 kubenswrapper[15202]: I0319 09:54:29.993300 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:54:29.997496 master-0 kubenswrapper[15202]: I0319 09:54:29.997435 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 09:54:29.997902 master-0 kubenswrapper[15202]: I0319 09:54:29.997510 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 09:54:29.998113 master-0 kubenswrapper[15202]: I0319 09:54:29.997579 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 09:54:30.006375 master-0 kubenswrapper[15202]: I0319 09:54:30.006228 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:30.019219 master-0 kubenswrapper[15202]: I0319 09:54:30.019125 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:54:30.021407 master-0 kubenswrapper[15202]: I0319 09:54:30.021337 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:54:30.027088 master-0 kubenswrapper[15202]: I0319 09:54:30.027019 15202 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 09:54:30.029391 master-0 kubenswrapper[15202]: I0319 09:54:30.029350 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:54:30.092565 master-0 kubenswrapper[15202]: I0319 09:54:30.092412 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63107e92-79e8-45d6-af32-05d718986530-logs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.092865 master-0 kubenswrapper[15202]: I0319 09:54:30.092836 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-public-tls-certs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.093117 master-0 kubenswrapper[15202]: I0319 09:54:30.093044 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661ff0ec-2637-474a-b47e-658ac7e62908-config-data\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.093329 master-0 kubenswrapper[15202]: I0319 09:54:30.093310 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.093497 master-0 kubenswrapper[15202]: 
I0319 09:54:30.093457 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9t5b\" (UniqueName: \"kubernetes.io/projected/63107e92-79e8-45d6-af32-05d718986530-kube-api-access-j9t5b\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.093630 master-0 kubenswrapper[15202]: I0319 09:54:30.093609 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661ff0ec-2637-474a-b47e-658ac7e62908-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.093758 master-0 kubenswrapper[15202]: I0319 09:54:30.093739 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-config-data\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.094082 master-0 kubenswrapper[15202]: I0319 09:54:30.094062 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbhm\" (UniqueName: \"kubernetes.io/projected/661ff0ec-2637-474a-b47e-658ac7e62908-kube-api-access-jcbhm\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.094280 master-0 kubenswrapper[15202]: I0319 09:54:30.094244 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.196408 master-0 kubenswrapper[15202]: I0319 
09:54:30.196305 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63107e92-79e8-45d6-af32-05d718986530-logs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.196408 master-0 kubenswrapper[15202]: I0319 09:54:30.196368 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-public-tls-certs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.196408 master-0 kubenswrapper[15202]: I0319 09:54:30.196394 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661ff0ec-2637-474a-b47e-658ac7e62908-config-data\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.196743 master-0 kubenswrapper[15202]: I0319 09:54:30.196486 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.196743 master-0 kubenswrapper[15202]: I0319 09:54:30.196519 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9t5b\" (UniqueName: \"kubernetes.io/projected/63107e92-79e8-45d6-af32-05d718986530-kube-api-access-j9t5b\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.196743 master-0 kubenswrapper[15202]: I0319 09:54:30.196543 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/661ff0ec-2637-474a-b47e-658ac7e62908-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.196743 master-0 kubenswrapper[15202]: I0319 09:54:30.196568 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-config-data\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.196743 master-0 kubenswrapper[15202]: I0319 09:54:30.196657 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbhm\" (UniqueName: \"kubernetes.io/projected/661ff0ec-2637-474a-b47e-658ac7e62908-kube-api-access-jcbhm\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.196743 master-0 kubenswrapper[15202]: I0319 09:54:30.196696 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.198231 master-0 kubenswrapper[15202]: I0319 09:54:30.198036 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63107e92-79e8-45d6-af32-05d718986530-logs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.202862 master-0 kubenswrapper[15202]: I0319 09:54:30.202279 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-config-data\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " 
pod="openstack/nova-api-0" Mar 19 09:54:30.202862 master-0 kubenswrapper[15202]: I0319 09:54:30.202453 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661ff0ec-2637-474a-b47e-658ac7e62908-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.202862 master-0 kubenswrapper[15202]: I0319 09:54:30.202829 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661ff0ec-2637-474a-b47e-658ac7e62908-config-data\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.203012 master-0 kubenswrapper[15202]: I0319 09:54:30.202972 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-public-tls-certs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.203785 master-0 kubenswrapper[15202]: I0319 09:54:30.203671 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.207594 master-0 kubenswrapper[15202]: I0319 09:54:30.207450 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63107e92-79e8-45d6-af32-05d718986530-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.214062 master-0 kubenswrapper[15202]: I0319 09:54:30.214016 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jcbhm\" (UniqueName: \"kubernetes.io/projected/661ff0ec-2637-474a-b47e-658ac7e62908-kube-api-access-jcbhm\") pod \"nova-scheduler-0\" (UID: \"661ff0ec-2637-474a-b47e-658ac7e62908\") " pod="openstack/nova-scheduler-0" Mar 19 09:54:30.218596 master-0 kubenswrapper[15202]: I0319 09:54:30.218528 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9t5b\" (UniqueName: \"kubernetes.io/projected/63107e92-79e8-45d6-af32-05d718986530-kube-api-access-j9t5b\") pod \"nova-api-0\" (UID: \"63107e92-79e8-45d6-af32-05d718986530\") " pod="openstack/nova-api-0" Mar 19 09:54:30.323505 master-0 kubenswrapper[15202]: I0319 09:54:30.323087 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:54:30.346048 master-0 kubenswrapper[15202]: I0319 09:54:30.345926 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:54:30.827610 master-0 kubenswrapper[15202]: I0319 09:54:30.826953 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db115b0-fcb3-4915-9acb-e8df2a364c9c" path="/var/lib/kubelet/pods/3db115b0-fcb3-4915-9acb-e8df2a364c9c/volumes" Mar 19 09:54:30.828733 master-0 kubenswrapper[15202]: I0319 09:54:30.827697 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657520fd-77a9-49cd-a2c0-5b3f9da06c59" path="/var/lib/kubelet/pods/657520fd-77a9-49cd-a2c0-5b3f9da06c59/volumes" Mar 19 09:54:30.968277 master-0 kubenswrapper[15202]: I0319 09:54:30.968209 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:54:31.006095 master-0 kubenswrapper[15202]: I0319 09:54:31.005806 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:54:31.013555 master-0 kubenswrapper[15202]: W0319 09:54:31.013477 15202 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63107e92_79e8_45d6_af32_05d718986530.slice/crio-97e66ad0fd7429be175a5e1f30aef34f028005a62281cd1de51d26ab145c136b WatchSource:0}: Error finding container 97e66ad0fd7429be175a5e1f30aef34f028005a62281cd1de51d26ab145c136b: Status 404 returned error can't find the container with id 97e66ad0fd7429be175a5e1f30aef34f028005a62281cd1de51d26ab145c136b Mar 19 09:54:31.834745 master-0 kubenswrapper[15202]: I0319 09:54:31.834544 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"661ff0ec-2637-474a-b47e-658ac7e62908","Type":"ContainerStarted","Data":"26d7907ba61a7a8126bbfba8721c06e736177a4f37ab75b9f99bda68b65a8670"} Mar 19 09:54:31.834745 master-0 kubenswrapper[15202]: I0319 09:54:31.834632 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"661ff0ec-2637-474a-b47e-658ac7e62908","Type":"ContainerStarted","Data":"6ee1054590e14f7c3a468bc24ed24abce66aa478165c2b5aae4617d3b721d79e"} Mar 19 09:54:31.840922 master-0 kubenswrapper[15202]: I0319 09:54:31.840114 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63107e92-79e8-45d6-af32-05d718986530","Type":"ContainerStarted","Data":"b9a25273a5772ebdc7c882205c5ef1fb3b3a0e5db87dc0e4ea85165186e3f3cf"} Mar 19 09:54:31.840922 master-0 kubenswrapper[15202]: I0319 09:54:31.840156 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63107e92-79e8-45d6-af32-05d718986530","Type":"ContainerStarted","Data":"3b00444992c3977f931bbaad0fb765b4a5027f9a70a9486a42655490d8946542"} Mar 19 09:54:31.840922 master-0 kubenswrapper[15202]: I0319 09:54:31.840168 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63107e92-79e8-45d6-af32-05d718986530","Type":"ContainerStarted","Data":"97e66ad0fd7429be175a5e1f30aef34f028005a62281cd1de51d26ab145c136b"} Mar 19 
09:54:31.875485 master-0 kubenswrapper[15202]: I0319 09:54:31.875308 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.875264095 podStartE2EDuration="2.875264095s" podCreationTimestamp="2026-03-19 09:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:31.857569858 +0000 UTC m=+1789.242984674" watchObservedRunningTime="2026-03-19 09:54:31.875264095 +0000 UTC m=+1789.260678941" Mar 19 09:54:31.912297 master-0 kubenswrapper[15202]: I0319 09:54:31.912160 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.912127178 podStartE2EDuration="2.912127178s" podCreationTimestamp="2026-03-19 09:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:31.90293289 +0000 UTC m=+1789.288347726" watchObservedRunningTime="2026-03-19 09:54:31.912127178 +0000 UTC m=+1789.297542014" Mar 19 09:54:35.346813 master-0 kubenswrapper[15202]: I0319 09:54:35.346758 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 09:54:38.222221 master-0 kubenswrapper[15202]: I0319 09:54:38.222111 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 09:54:38.222812 master-0 kubenswrapper[15202]: I0319 09:54:38.222249 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 09:54:39.238873 master-0 kubenswrapper[15202]: I0319 09:54:39.238774 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="17791259-f6c3-44a3-9ee1-f87b6c7db780" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.15:8775/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:54:39.239546 master-0 kubenswrapper[15202]: I0319 09:54:39.238778 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="17791259-f6c3-44a3-9ee1-f87b6c7db780" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.15:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:54:40.323434 master-0 kubenswrapper[15202]: I0319 09:54:40.323349 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:54:40.323434 master-0 kubenswrapper[15202]: I0319 09:54:40.323418 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:54:40.346596 master-0 kubenswrapper[15202]: I0319 09:54:40.346521 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 09:54:40.386447 master-0 kubenswrapper[15202]: I0319 09:54:40.386380 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 09:54:41.004872 master-0 kubenswrapper[15202]: I0319 09:54:41.004742 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 09:54:41.341702 master-0 kubenswrapper[15202]: I0319 09:54:41.341627 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63107e92-79e8-45d6-af32-05d718986530" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:54:41.342394 master-0 kubenswrapper[15202]: I0319 09:54:41.341663 15202 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63107e92-79e8-45d6-af32-05d718986530" containerName="nova-api-log" 
probeResult="failure" output="Get \"https://10.128.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:54:46.221670 master-0 kubenswrapper[15202]: I0319 09:54:46.221593 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 09:54:46.221670 master-0 kubenswrapper[15202]: I0319 09:54:46.221681 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 09:54:48.232703 master-0 kubenswrapper[15202]: I0319 09:54:48.232615 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 09:54:48.236039 master-0 kubenswrapper[15202]: I0319 09:54:48.235973 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 09:54:48.240559 master-0 kubenswrapper[15202]: I0319 09:54:48.240446 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 09:54:48.325460 master-0 kubenswrapper[15202]: I0319 09:54:48.323640 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:54:48.325460 master-0 kubenswrapper[15202]: I0319 09:54:48.323741 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:54:49.102141 master-0 kubenswrapper[15202]: I0319 09:54:49.102043 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 09:54:50.330075 master-0 kubenswrapper[15202]: I0319 09:54:50.329986 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 09:54:50.332290 master-0 kubenswrapper[15202]: I0319 09:54:50.332204 15202 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 09:54:50.344917 master-0 kubenswrapper[15202]: I0319 
09:54:50.344848 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 09:54:51.125237 master-0 kubenswrapper[15202]: I0319 09:54:51.125142 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 09:56:02.782886 master-0 kubenswrapper[15202]: I0319 09:56:02.782785 15202 scope.go:117] "RemoveContainer" containerID="71bebb15c1a6dd7e020d7a54eff7e872970a7e0e20a5ff73cf24ccf5c304d482" Mar 19 09:57:02.913689 master-0 kubenswrapper[15202]: I0319 09:57:02.913409 15202 scope.go:117] "RemoveContainer" containerID="5a63a529310cf423d9ea418b0b8e064e9c7716971f0f72c8873aa5512c01c530" Mar 19 09:57:02.957935 master-0 kubenswrapper[15202]: I0319 09:57:02.957313 15202 scope.go:117] "RemoveContainer" containerID="9941eb25adc16021af551ad625b756cead27849d221b0f4deee8036d26ddd3fa" Mar 19 09:57:02.991727 master-0 kubenswrapper[15202]: I0319 09:57:02.991629 15202 scope.go:117] "RemoveContainer" containerID="a14e8229932890689469fd4553d6d7254e31b6aa71d9d75a552b4a62a0b99786" Mar 19 09:57:03.025963 master-0 kubenswrapper[15202]: I0319 09:57:03.025692 15202 scope.go:117] "RemoveContainer" containerID="c52b2aa7760d01a923982628c21e81cd4bdbb7bcc63829779172f81c54b5d42a" Mar 19 09:57:03.053656 master-0 kubenswrapper[15202]: I0319 09:57:03.053578 15202 scope.go:117] "RemoveContainer" containerID="4b46b44ed883040d9b5cc4af5e2c5fbbdb384f3131b50be056f41674e66144a8" Mar 19 09:58:03.201725 master-0 kubenswrapper[15202]: I0319 09:58:03.201657 15202 scope.go:117] "RemoveContainer" containerID="7ff529613299b924c3d4cb1d4031e6538c493fe9beb8cb333c50be6f14dacc6a" Mar 19 09:58:03.233678 master-0 kubenswrapper[15202]: I0319 09:58:03.233372 15202 scope.go:117] "RemoveContainer" containerID="006abe7cbc3153c692320c85322b89c89f63c8ad1b8605e8c10f9ff7418e02cf" Mar 19 09:59:51.076690 master-0 kubenswrapper[15202]: I0319 09:59:51.076607 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-fc3e-account-create-update-btzjb"] Mar 19 09:59:51.091710 master-0 kubenswrapper[15202]: I0319 09:59:51.091634 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nj4vf"] Mar 19 09:59:51.103906 master-0 kubenswrapper[15202]: I0319 09:59:51.103819 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fc3e-account-create-update-btzjb"] Mar 19 09:59:51.116577 master-0 kubenswrapper[15202]: I0319 09:59:51.116442 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nj4vf"] Mar 19 09:59:52.844802 master-0 kubenswrapper[15202]: I0319 09:59:52.844658 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db9a43e6-7b74-426b-83ee-50df7a0270e7" path="/var/lib/kubelet/pods/db9a43e6-7b74-426b-83ee-50df7a0270e7/volumes" Mar 19 09:59:52.847917 master-0 kubenswrapper[15202]: I0319 09:59:52.847860 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817" path="/var/lib/kubelet/pods/ff3d2b64-e9bb-4ddb-8e0f-a730c27a1817/volumes" Mar 19 09:59:59.084991 master-0 kubenswrapper[15202]: I0319 09:59:59.084872 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f95e-account-create-update-gph65"] Mar 19 09:59:59.113707 master-0 kubenswrapper[15202]: I0319 09:59:59.112990 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wpbkz"] Mar 19 09:59:59.134426 master-0 kubenswrapper[15202]: I0319 09:59:59.134327 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5bqq7"] Mar 19 09:59:59.151636 master-0 kubenswrapper[15202]: I0319 09:59:59.151530 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-16fb-account-create-update-8cp5c"] Mar 19 09:59:59.167654 master-0 kubenswrapper[15202]: I0319 09:59:59.167590 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-f95e-account-create-update-gph65"] Mar 19 09:59:59.187377 master-0 kubenswrapper[15202]: I0319 09:59:59.186660 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-16fb-account-create-update-8cp5c"] Mar 19 09:59:59.202842 master-0 kubenswrapper[15202]: I0319 09:59:59.202776 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wpbkz"] Mar 19 09:59:59.215920 master-0 kubenswrapper[15202]: I0319 09:59:59.215803 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5bqq7"] Mar 19 10:00:00.861362 master-0 kubenswrapper[15202]: I0319 10:00:00.861201 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b34559e-c4bb-497f-9659-c0179f8010be" path="/var/lib/kubelet/pods/4b34559e-c4bb-497f-9659-c0179f8010be/volumes" Mar 19 10:00:00.864751 master-0 kubenswrapper[15202]: I0319 10:00:00.864662 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cac917-044f-4346-81f4-336f070d2557" path="/var/lib/kubelet/pods/68cac917-044f-4346-81f4-336f070d2557/volumes" Mar 19 10:00:00.868558 master-0 kubenswrapper[15202]: I0319 10:00:00.868514 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19f30b4-71bc-486c-9fa7-306c8dac619b" path="/var/lib/kubelet/pods/b19f30b4-71bc-486c-9fa7-306c8dac619b/volumes" Mar 19 10:00:00.871451 master-0 kubenswrapper[15202]: I0319 10:00:00.871409 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6308b84-19d0-444a-9ef6-6ab67792a0c5" path="/var/lib/kubelet/pods/b6308b84-19d0-444a-9ef6-6ab67792a0c5/volumes" Mar 19 10:00:03.335012 master-0 kubenswrapper[15202]: I0319 10:00:03.334852 15202 scope.go:117] "RemoveContainer" containerID="23dc01441218bd4c719812d52d08d2e2a21ed5b494082120e597b96b806fb5ab" Mar 19 10:00:03.376586 master-0 kubenswrapper[15202]: I0319 10:00:03.375817 15202 scope.go:117] "RemoveContainer" 
containerID="bf9280325357452e6fda657e1d35b0337731a76ba60bf9ce428f7c001f2e72d5" Mar 19 10:00:03.422559 master-0 kubenswrapper[15202]: I0319 10:00:03.422446 15202 scope.go:117] "RemoveContainer" containerID="4576df9849ff0c12cbc12baac98225d3548ba664159881c63610cbe608e13714" Mar 19 10:00:03.456984 master-0 kubenswrapper[15202]: I0319 10:00:03.456918 15202 scope.go:117] "RemoveContainer" containerID="482c2e0767034c6d9ed269e91940e6a2f51314a39b85e17162a1995591d2d1f8" Mar 19 10:00:03.505609 master-0 kubenswrapper[15202]: I0319 10:00:03.505472 15202 scope.go:117] "RemoveContainer" containerID="30e76772395e2cd00d44891ebcc7926f70b32eb387104ca51f9cd09acf540dcc" Mar 19 10:00:03.549032 master-0 kubenswrapper[15202]: I0319 10:00:03.544869 15202 scope.go:117] "RemoveContainer" containerID="d9bce657371028171095e39d0d257d8bf1207d54602f1af9a00652c4bfc9ae5a" Mar 19 10:00:11.425955 master-0 kubenswrapper[15202]: I0319 10:00:11.425862 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dh5fs"] Mar 19 10:00:11.450796 master-0 kubenswrapper[15202]: I0319 10:00:11.450710 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dh5fs"] Mar 19 10:00:12.836645 master-0 kubenswrapper[15202]: I0319 10:00:12.836562 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b308d6d-e494-4e76-8b8f-5c661340666b" path="/var/lib/kubelet/pods/7b308d6d-e494-4e76-8b8f-5c661340666b/volumes" Mar 19 10:00:22.217978 master-0 kubenswrapper[15202]: I0319 10:00:22.217886 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8nppp"] Mar 19 10:00:22.240802 master-0 kubenswrapper[15202]: I0319 10:00:22.240706 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wjhhn"] Mar 19 10:00:22.418714 master-0 kubenswrapper[15202]: I0319 10:00:22.418515 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-3735-account-create-update-59xbx"] Mar 19 10:00:22.433825 master-0 kubenswrapper[15202]: I0319 10:00:22.433716 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-886c-account-create-update-24ntn"] Mar 19 10:00:22.447076 master-0 kubenswrapper[15202]: I0319 10:00:22.446998 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wjhhn"] Mar 19 10:00:22.536139 master-0 kubenswrapper[15202]: I0319 10:00:22.535868 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8nppp"] Mar 19 10:00:22.549964 master-0 kubenswrapper[15202]: I0319 10:00:22.549886 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3735-account-create-update-59xbx"] Mar 19 10:00:22.565072 master-0 kubenswrapper[15202]: I0319 10:00:22.565002 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-886c-account-create-update-24ntn"] Mar 19 10:00:22.833734 master-0 kubenswrapper[15202]: I0319 10:00:22.833579 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310506d0-94fa-4573-a29a-5ddd012b6e64" path="/var/lib/kubelet/pods/310506d0-94fa-4573-a29a-5ddd012b6e64/volumes" Mar 19 10:00:22.836829 master-0 kubenswrapper[15202]: I0319 10:00:22.836790 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c485a6b-17be-4afe-8110-a57f77347be1" path="/var/lib/kubelet/pods/3c485a6b-17be-4afe-8110-a57f77347be1/volumes" Mar 19 10:00:22.839080 master-0 kubenswrapper[15202]: I0319 10:00:22.839043 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482a5739-51df-41c5-89b0-8ea2e82cee8a" path="/var/lib/kubelet/pods/482a5739-51df-41c5-89b0-8ea2e82cee8a/volumes" Mar 19 10:00:22.840526 master-0 kubenswrapper[15202]: I0319 10:00:22.840440 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592e718b-4b77-4ef3-8ee0-e7ce98415e3c" 
path="/var/lib/kubelet/pods/592e718b-4b77-4ef3-8ee0-e7ce98415e3c/volumes" Mar 19 10:00:25.099985 master-0 kubenswrapper[15202]: I0319 10:00:25.099895 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zxw2c"] Mar 19 10:00:25.115509 master-0 kubenswrapper[15202]: I0319 10:00:25.115381 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zxw2c"] Mar 19 10:00:26.843282 master-0 kubenswrapper[15202]: I0319 10:00:26.843123 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9bce886-0af6-432e-ab68-b4af30a4defd" path="/var/lib/kubelet/pods/c9bce886-0af6-432e-ab68-b4af30a4defd/volumes" Mar 19 10:00:29.150754 master-0 kubenswrapper[15202]: I0319 10:00:29.150683 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vk8gz"] Mar 19 10:00:29.177397 master-0 kubenswrapper[15202]: I0319 10:00:29.177334 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vk8gz"] Mar 19 10:00:30.826952 master-0 kubenswrapper[15202]: I0319 10:00:30.826857 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fefe00c-9546-4205-a4e5-a73e807d6bf4" path="/var/lib/kubelet/pods/9fefe00c-9546-4205-a4e5-a73e807d6bf4/volumes" Mar 19 10:01:00.194996 master-0 kubenswrapper[15202]: I0319 10:01:00.194309 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565241-vpcdg"] Mar 19 10:01:00.196092 master-0 kubenswrapper[15202]: I0319 10:01:00.196050 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.229904 master-0 kubenswrapper[15202]: I0319 10:01:00.229780 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565241-vpcdg"] Mar 19 10:01:00.366679 master-0 kubenswrapper[15202]: I0319 10:01:00.366567 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-combined-ca-bundle\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.367428 master-0 kubenswrapper[15202]: I0319 10:01:00.367310 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-fernet-keys\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.367774 master-0 kubenswrapper[15202]: I0319 10:01:00.367725 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-config-data\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.367849 master-0 kubenswrapper[15202]: I0319 10:01:00.367817 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bmrg\" (UniqueName: \"kubernetes.io/projected/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-kube-api-access-5bmrg\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.470733 master-0 kubenswrapper[15202]: I0319 
10:01:00.470567 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-fernet-keys\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.470733 master-0 kubenswrapper[15202]: I0319 10:01:00.470663 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-config-data\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.470733 master-0 kubenswrapper[15202]: I0319 10:01:00.470692 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bmrg\" (UniqueName: \"kubernetes.io/projected/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-kube-api-access-5bmrg\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.471037 master-0 kubenswrapper[15202]: I0319 10:01:00.470785 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-combined-ca-bundle\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.475099 master-0 kubenswrapper[15202]: I0319 10:01:00.475064 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-combined-ca-bundle\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.475418 master-0 
kubenswrapper[15202]: I0319 10:01:00.475337 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-config-data\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.476879 master-0 kubenswrapper[15202]: I0319 10:01:00.476825 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-fernet-keys\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.494866 master-0 kubenswrapper[15202]: I0319 10:01:00.494630 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bmrg\" (UniqueName: \"kubernetes.io/projected/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-kube-api-access-5bmrg\") pod \"keystone-cron-29565241-vpcdg\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:00.540343 master-0 kubenswrapper[15202]: I0319 10:01:00.540242 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:01.053915 master-0 kubenswrapper[15202]: I0319 10:01:01.053811 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565241-vpcdg"] Mar 19 10:01:01.427274 master-0 kubenswrapper[15202]: I0319 10:01:01.427100 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-vpcdg" event={"ID":"aa3b5bce-9cb9-47f1-9b40-238df5b3a007","Type":"ContainerStarted","Data":"ea6841a0120f9143b8087012170c48ef723d9d2eee2fd17ded6c41bf1bbc9eb4"} Mar 19 10:01:01.427274 master-0 kubenswrapper[15202]: I0319 10:01:01.427167 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-vpcdg" event={"ID":"aa3b5bce-9cb9-47f1-9b40-238df5b3a007","Type":"ContainerStarted","Data":"651a3eef604d9210d8ef3fa178ffc03b4493864078acc30f72a1db6e012d6a2d"} Mar 19 10:01:02.173064 master-0 kubenswrapper[15202]: I0319 10:01:02.172888 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565241-vpcdg" podStartSLOduration=2.172856011 podStartE2EDuration="2.172856011s" podCreationTimestamp="2026-03-19 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:01:01.450308747 +0000 UTC m=+2178.835723563" watchObservedRunningTime="2026-03-19 10:01:02.172856011 +0000 UTC m=+2179.558270827" Mar 19 10:01:02.185258 master-0 kubenswrapper[15202]: I0319 10:01:02.185152 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2flmr"] Mar 19 10:01:02.209523 master-0 kubenswrapper[15202]: I0319 10:01:02.209394 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2flmr"] Mar 19 10:01:02.837400 master-0 kubenswrapper[15202]: I0319 10:01:02.837305 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="85ab6d34-c24f-4e22-ac73-939b5a791240" path="/var/lib/kubelet/pods/85ab6d34-c24f-4e22-ac73-939b5a791240/volumes" Mar 19 10:01:03.740537 master-0 kubenswrapper[15202]: I0319 10:01:03.740481 15202 scope.go:117] "RemoveContainer" containerID="1889dd316ddb14048830e434954b638b78e3749d2615c2a488f5b1ea38ff640c" Mar 19 10:01:03.768457 master-0 kubenswrapper[15202]: I0319 10:01:03.768409 15202 scope.go:117] "RemoveContainer" containerID="9d9eca4b3f22a2760500a2b0f6800524d7e8068e31b227e0c275e5169dc7a91d" Mar 19 10:01:03.787955 master-0 kubenswrapper[15202]: I0319 10:01:03.787603 15202 scope.go:117] "RemoveContainer" containerID="c1b707bc875ad8212e30e4e8f547814d3aaefa608648a913c592528a62709843" Mar 19 10:01:03.835030 master-0 kubenswrapper[15202]: I0319 10:01:03.832789 15202 scope.go:117] "RemoveContainer" containerID="4fcad111ea9d40bd2336613c4152e6d793f8b7459f474e8bfe24622587250403" Mar 19 10:01:03.885478 master-0 kubenswrapper[15202]: I0319 10:01:03.885417 15202 scope.go:117] "RemoveContainer" containerID="c62c66022b5964dfe35ac5ca7b1a14f92fc70df12e6a5bfbd36cad4cccb3c538" Mar 19 10:01:03.933295 master-0 kubenswrapper[15202]: I0319 10:01:03.933219 15202 scope.go:117] "RemoveContainer" containerID="fd52e98f6b5e95065c9e493daa1210887d0bb74539312b3915d6c7cf6725b9d7" Mar 19 10:01:03.964812 master-0 kubenswrapper[15202]: I0319 10:01:03.964750 15202 scope.go:117] "RemoveContainer" containerID="3efb915151e6ca3740a776363e7453d24670d0b9cd300ddafeee1f941012cc5f" Mar 19 10:01:03.995158 master-0 kubenswrapper[15202]: I0319 10:01:03.995111 15202 scope.go:117] "RemoveContainer" containerID="16cfa87650ea68a18704ad490c72a2cc838c467216e77480298c8d54d6c0a2b3" Mar 19 10:01:04.471278 master-0 kubenswrapper[15202]: I0319 10:01:04.471173 15202 generic.go:334] "Generic (PLEG): container finished" podID="aa3b5bce-9cb9-47f1-9b40-238df5b3a007" containerID="ea6841a0120f9143b8087012170c48ef723d9d2eee2fd17ded6c41bf1bbc9eb4" exitCode=0 Mar 19 10:01:04.471278 master-0 kubenswrapper[15202]: I0319 
10:01:04.471268 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-vpcdg" event={"ID":"aa3b5bce-9cb9-47f1-9b40-238df5b3a007","Type":"ContainerDied","Data":"ea6841a0120f9143b8087012170c48ef723d9d2eee2fd17ded6c41bf1bbc9eb4"} Mar 19 10:01:06.029415 master-0 kubenswrapper[15202]: I0319 10:01:06.029351 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:06.175523 master-0 kubenswrapper[15202]: I0319 10:01:06.175320 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-combined-ca-bundle\") pod \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " Mar 19 10:01:06.175523 master-0 kubenswrapper[15202]: I0319 10:01:06.175452 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-fernet-keys\") pod \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " Mar 19 10:01:06.175749 master-0 kubenswrapper[15202]: I0319 10:01:06.175565 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-config-data\") pod \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " Mar 19 10:01:06.175848 master-0 kubenswrapper[15202]: I0319 10:01:06.175814 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bmrg\" (UniqueName: \"kubernetes.io/projected/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-kube-api-access-5bmrg\") pod \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\" (UID: \"aa3b5bce-9cb9-47f1-9b40-238df5b3a007\") " Mar 19 10:01:06.179155 master-0 kubenswrapper[15202]: 
I0319 10:01:06.179108 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "aa3b5bce-9cb9-47f1-9b40-238df5b3a007" (UID: "aa3b5bce-9cb9-47f1-9b40-238df5b3a007"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:06.182665 master-0 kubenswrapper[15202]: I0319 10:01:06.182294 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-kube-api-access-5bmrg" (OuterVolumeSpecName: "kube-api-access-5bmrg") pod "aa3b5bce-9cb9-47f1-9b40-238df5b3a007" (UID: "aa3b5bce-9cb9-47f1-9b40-238df5b3a007"). InnerVolumeSpecName "kube-api-access-5bmrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:01:06.215303 master-0 kubenswrapper[15202]: I0319 10:01:06.214595 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa3b5bce-9cb9-47f1-9b40-238df5b3a007" (UID: "aa3b5bce-9cb9-47f1-9b40-238df5b3a007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:06.237746 master-0 kubenswrapper[15202]: I0319 10:01:06.237651 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-config-data" (OuterVolumeSpecName: "config-data") pod "aa3b5bce-9cb9-47f1-9b40-238df5b3a007" (UID: "aa3b5bce-9cb9-47f1-9b40-238df5b3a007"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:06.280195 master-0 kubenswrapper[15202]: I0319 10:01:06.280114 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bmrg\" (UniqueName: \"kubernetes.io/projected/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-kube-api-access-5bmrg\") on node \"master-0\" DevicePath \"\"" Mar 19 10:01:06.280195 master-0 kubenswrapper[15202]: I0319 10:01:06.280200 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 10:01:06.280339 master-0 kubenswrapper[15202]: I0319 10:01:06.280213 15202 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 19 10:01:06.280339 master-0 kubenswrapper[15202]: I0319 10:01:06.280224 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b5bce-9cb9-47f1-9b40-238df5b3a007-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 10:01:06.512757 master-0 kubenswrapper[15202]: I0319 10:01:06.511855 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-vpcdg" event={"ID":"aa3b5bce-9cb9-47f1-9b40-238df5b3a007","Type":"ContainerDied","Data":"651a3eef604d9210d8ef3fa178ffc03b4493864078acc30f72a1db6e012d6a2d"} Mar 19 10:01:06.512757 master-0 kubenswrapper[15202]: I0319 10:01:06.511953 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651a3eef604d9210d8ef3fa178ffc03b4493864078acc30f72a1db6e012d6a2d" Mar 19 10:01:06.512757 master-0 kubenswrapper[15202]: I0319 10:01:06.512073 15202 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565241-vpcdg" Mar 19 10:01:09.037560 master-0 kubenswrapper[15202]: I0319 10:01:09.037441 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hbzpf"] Mar 19 10:01:09.053907 master-0 kubenswrapper[15202]: I0319 10:01:09.053793 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hbzpf"] Mar 19 10:01:10.046500 master-0 kubenswrapper[15202]: I0319 10:01:10.046358 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xl426"] Mar 19 10:01:10.056987 master-0 kubenswrapper[15202]: I0319 10:01:10.056908 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xl426"] Mar 19 10:01:10.829955 master-0 kubenswrapper[15202]: I0319 10:01:10.829230 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972a5655-2953-4875-b9cd-2b5481c6ff30" path="/var/lib/kubelet/pods/972a5655-2953-4875-b9cd-2b5481c6ff30/volumes" Mar 19 10:01:10.831856 master-0 kubenswrapper[15202]: I0319 10:01:10.831806 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8782ab3-387f-49a1-94ae-46ba9f1e4241" path="/var/lib/kubelet/pods/d8782ab3-387f-49a1-94ae-46ba9f1e4241/volumes" Mar 19 10:01:12.045280 master-0 kubenswrapper[15202]: I0319 10:01:12.044935 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ba05-db-sync-jdc2m"] Mar 19 10:01:12.065855 master-0 kubenswrapper[15202]: I0319 10:01:12.065792 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ba05-db-sync-jdc2m"] Mar 19 10:01:12.866641 master-0 kubenswrapper[15202]: I0319 10:01:12.866570 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e14c4d-4d08-4c3c-8803-d39b03125169" path="/var/lib/kubelet/pods/04e14c4d-4d08-4c3c-8803-d39b03125169/volumes" Mar 19 10:01:44.061741 master-0 kubenswrapper[15202]: I0319 10:01:44.061621 15202 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-lfsjb"] Mar 19 10:01:44.076528 master-0 kubenswrapper[15202]: I0319 10:01:44.074191 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-lfsjb"] Mar 19 10:01:44.834408 master-0 kubenswrapper[15202]: I0319 10:01:44.834278 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d2318b-1a1b-45a3-9b06-ea750daf9e17" path="/var/lib/kubelet/pods/18d2318b-1a1b-45a3-9b06-ea750daf9e17/volumes" Mar 19 10:01:45.055701 master-0 kubenswrapper[15202]: I0319 10:01:45.055620 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-x7j8z"] Mar 19 10:01:45.069949 master-0 kubenswrapper[15202]: I0319 10:01:45.069861 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-x7j8z"] Mar 19 10:01:46.826900 master-0 kubenswrapper[15202]: I0319 10:01:46.826795 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d49e40e-907d-47f6-b07e-1cf72ec3f3a9" path="/var/lib/kubelet/pods/8d49e40e-907d-47f6-b07e-1cf72ec3f3a9/volumes" Mar 19 10:02:04.333286 master-0 kubenswrapper[15202]: I0319 10:02:04.333165 15202 scope.go:117] "RemoveContainer" containerID="42ca44b5febf4f3dca1cd9cb5cb5b30c9ac65c7ccd22ac6082676dc8fbf27bd5" Mar 19 10:02:04.374716 master-0 kubenswrapper[15202]: I0319 10:02:04.374512 15202 scope.go:117] "RemoveContainer" containerID="62d661213ddc7402b16f43961ab0ddd88189e01cab778f58d3c4309a43d8ab8e" Mar 19 10:02:04.414880 master-0 kubenswrapper[15202]: I0319 10:02:04.414796 15202 scope.go:117] "RemoveContainer" containerID="f6ac93c4bee5849db982df9b62b9dcdfee2899a896588d3b6aa0b6351cc21e33" Mar 19 10:02:04.455813 master-0 kubenswrapper[15202]: I0319 10:02:04.455720 15202 scope.go:117] "RemoveContainer" 
containerID="6350fb831fcd1852ae42fd0cac2ad4b82126f840a2f7c9cf75aed041b345afe0" Mar 19 10:02:04.504040 master-0 kubenswrapper[15202]: I0319 10:02:04.503966 15202 scope.go:117] "RemoveContainer" containerID="b351eff4c20bb908b338fde6cdda00d87e674e8f9831fe42719e7a444cd88df8" Mar 19 10:02:04.542820 master-0 kubenswrapper[15202]: I0319 10:02:04.542649 15202 scope.go:117] "RemoveContainer" containerID="ebf23344f287355631505c388ebb7641a50e86eb695575058db26a33e7064584" Mar 19 10:02:04.566709 master-0 kubenswrapper[15202]: I0319 10:02:04.566649 15202 scope.go:117] "RemoveContainer" containerID="c38e638a8d646d2bdcd40ecea57b0e7de5a5b1776624134656b4f20ac203abd8" Mar 19 10:02:18.063572 master-0 kubenswrapper[15202]: I0319 10:02:18.063505 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-j7tk6"] Mar 19 10:02:18.076888 master-0 kubenswrapper[15202]: I0319 10:02:18.076801 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-j7tk6"] Mar 19 10:02:18.828718 master-0 kubenswrapper[15202]: I0319 10:02:18.828648 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9856624e-09c3-4b1c-bf33-3f57ba441335" path="/var/lib/kubelet/pods/9856624e-09c3-4b1c-bf33-3f57ba441335/volumes" Mar 19 10:02:26.051072 master-0 kubenswrapper[15202]: I0319 10:02:26.051011 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8395-account-create-update-8nzrz"] Mar 19 10:02:26.063628 master-0 kubenswrapper[15202]: I0319 10:02:26.063543 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8395-account-create-update-8nzrz"] Mar 19 10:02:26.839796 master-0 kubenswrapper[15202]: I0319 10:02:26.839666 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c95d54-8750-4daf-a981-9544bc6b1fc7" path="/var/lib/kubelet/pods/87c95d54-8750-4daf-a981-9544bc6b1fc7/volumes" Mar 19 10:02:28.033457 master-0 kubenswrapper[15202]: I0319 10:02:28.033396 15202 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zlb6q"] Mar 19 10:02:28.042881 master-0 kubenswrapper[15202]: I0319 10:02:28.042799 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zlb6q"] Mar 19 10:02:28.845460 master-0 kubenswrapper[15202]: I0319 10:02:28.845377 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcea5022-9090-4c15-8a38-91e20e844584" path="/var/lib/kubelet/pods/fcea5022-9090-4c15-8a38-91e20e844584/volumes" Mar 19 10:02:29.034718 master-0 kubenswrapper[15202]: I0319 10:02:29.034626 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jshcc"] Mar 19 10:02:29.048937 master-0 kubenswrapper[15202]: I0319 10:02:29.048856 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jshcc"] Mar 19 10:02:30.042439 master-0 kubenswrapper[15202]: I0319 10:02:30.042370 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5ec6-account-create-update-fn8fv"] Mar 19 10:02:30.057243 master-0 kubenswrapper[15202]: I0319 10:02:30.057168 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5ec6-account-create-update-fn8fv"] Mar 19 10:02:30.826743 master-0 kubenswrapper[15202]: I0319 10:02:30.826676 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db" path="/var/lib/kubelet/pods/d7a90d01-cab3-407a-b0bd-bfcaf5ebc9db/volumes" Mar 19 10:02:30.827582 master-0 kubenswrapper[15202]: I0319 10:02:30.827555 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2142cc2-3e56-4ff5-b467-b79d4a99c56c" path="/var/lib/kubelet/pods/f2142cc2-3e56-4ff5-b467-b79d4a99c56c/volumes" Mar 19 10:02:36.060744 master-0 kubenswrapper[15202]: I0319 10:02:36.060640 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ea37-account-create-update-c8nf7"] Mar 19 
10:02:36.076517 master-0 kubenswrapper[15202]: I0319 10:02:36.076377 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ea37-account-create-update-c8nf7"] Mar 19 10:02:36.837166 master-0 kubenswrapper[15202]: I0319 10:02:36.837002 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6898ed5b-562b-415f-93f6-ddf0c1e01558" path="/var/lib/kubelet/pods/6898ed5b-562b-415f-93f6-ddf0c1e01558/volumes" Mar 19 10:03:04.731943 master-0 kubenswrapper[15202]: I0319 10:03:04.731768 15202 scope.go:117] "RemoveContainer" containerID="3c38d79e2f1fc8e907a019093506e8a3a69aa6aa80d90fa80be9a09b4cf958c2" Mar 19 10:03:04.760995 master-0 kubenswrapper[15202]: I0319 10:03:04.760921 15202 scope.go:117] "RemoveContainer" containerID="4018b0ae4fb460db1cbc894596cf06ad4ce784a13810c83cbfd3e89536b62b70" Mar 19 10:03:04.789945 master-0 kubenswrapper[15202]: I0319 10:03:04.789882 15202 scope.go:117] "RemoveContainer" containerID="59eab6d4d9d8948434477e98def35f77419843608afebc02befd0c1d78c60024" Mar 19 10:03:04.827679 master-0 kubenswrapper[15202]: I0319 10:03:04.827618 15202 scope.go:117] "RemoveContainer" containerID="84a5b0ed57e92810141c8a1b5a49d2f7d9b9902da916052304c5505bf8eabb4b" Mar 19 10:03:04.856868 master-0 kubenswrapper[15202]: I0319 10:03:04.855564 15202 scope.go:117] "RemoveContainer" containerID="355c6401912de6e8deb25865a1a9284a576d2088603c72a2deeb1812456a827a" Mar 19 10:03:04.879629 master-0 kubenswrapper[15202]: I0319 10:03:04.879544 15202 scope.go:117] "RemoveContainer" containerID="ab0b41eddb547d86b9031856e7c1c3966231fef520771fc062128a8713086438" Mar 19 10:03:09.071480 master-0 kubenswrapper[15202]: I0319 10:03:09.071372 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x9mns"] Mar 19 10:03:09.084689 master-0 kubenswrapper[15202]: I0319 10:03:09.084578 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x9mns"] Mar 19 10:03:10.827331 
master-0 kubenswrapper[15202]: I0319 10:03:10.827259 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6" path="/var/lib/kubelet/pods/89e8b5d3-fcbe-42a0-bad7-a7cd30e6c2d6/volumes" Mar 19 10:03:32.082822 master-0 kubenswrapper[15202]: I0319 10:03:32.081282 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-t8sfd"] Mar 19 10:03:32.095998 master-0 kubenswrapper[15202]: I0319 10:03:32.095917 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-t8sfd"] Mar 19 10:03:32.834794 master-0 kubenswrapper[15202]: I0319 10:03:32.834717 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3" path="/var/lib/kubelet/pods/a1e9fe9a-8318-49fe-a5c6-b01a9737d2e3/volumes" Mar 19 10:03:38.083615 master-0 kubenswrapper[15202]: I0319 10:03:38.083540 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s6rxr"] Mar 19 10:03:38.134620 master-0 kubenswrapper[15202]: I0319 10:03:38.126773 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s6rxr"] Mar 19 10:03:38.829015 master-0 kubenswrapper[15202]: I0319 10:03:38.828926 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499db210-7cab-4a33-99b1-3be10260b2c2" path="/var/lib/kubelet/pods/499db210-7cab-4a33-99b1-3be10260b2c2/volumes" Mar 19 10:04:05.070772 master-0 kubenswrapper[15202]: I0319 10:04:05.070695 15202 scope.go:117] "RemoveContainer" containerID="26b548981a81bd303ac44e68210e5e48a84f86c169d173e4646aa80b4f6775c4" Mar 19 10:04:05.108255 master-0 kubenswrapper[15202]: I0319 10:04:05.108087 15202 scope.go:117] "RemoveContainer" containerID="7a3db32261122d75f83eb172b197008388cf30d3d9bff2ad096104ff269a2851" Mar 19 10:04:05.144349 master-0 kubenswrapper[15202]: I0319 10:04:05.144289 15202 scope.go:117] "RemoveContainer" 
containerID="e6bc5b6305aa59c429b80b85e16d99b583ea566e1a6003529d789d35901aa80e"
Mar 19 10:04:22.066147 master-0 kubenswrapper[15202]: I0319 10:04:22.065536 15202 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-nzzbt"]
Mar 19 10:04:22.078319 master-0 kubenswrapper[15202]: I0319 10:04:22.078167 15202 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-nzzbt"]
Mar 19 10:04:22.826927 master-0 kubenswrapper[15202]: I0319 10:04:22.826868 15202 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5132911-1eb7-4712-b4ed-b7a745d2c36b" path="/var/lib/kubelet/pods/c5132911-1eb7-4712-b4ed-b7a745d2c36b/volumes"
Mar 19 10:05:05.248562 master-0 kubenswrapper[15202]: I0319 10:05:05.248497 15202 scope.go:117] "RemoveContainer" containerID="498416a58915bdf931863577df951e3bcd558b68eaccae8e56c46022003dd2e2"
Mar 19 10:19:53.023930 master-0 kubenswrapper[15202]: I0319 10:19:53.023786 15202 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-77dc968fc8-nnkkj" podUID="180cd549-4f02-4a40-875d-5d44423f0b2f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Mar 19 10:29:58.612073 master-0 kubenswrapper[15202]: I0319 10:29:58.611882 15202 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-6888856db4-mgklh" podUID="fca9305a-b6de-4bca-a3cc-647f0a3bd7ad" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.128.0.137:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:01:00.359546 master-0 kubenswrapper[15202]: I0319 11:01:00.359322 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565301-skqb8"]
Mar 19 11:01:00.365791 master-0 kubenswrapper[15202]: E0319 11:01:00.365714 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3b5bce-9cb9-47f1-9b40-238df5b3a007" containerName="keystone-cron"
Mar 19 11:01:00.365791 master-0 kubenswrapper[15202]: I0319 11:01:00.365780 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3b5bce-9cb9-47f1-9b40-238df5b3a007" containerName="keystone-cron"
Mar 19 11:01:00.367404 master-0 kubenswrapper[15202]: I0319 11:01:00.367354 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa3b5bce-9cb9-47f1-9b40-238df5b3a007" containerName="keystone-cron"
Mar 19 11:01:00.370900 master-0 kubenswrapper[15202]: I0319 11:01:00.370849 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.478773 master-0 kubenswrapper[15202]: I0319 11:01:00.478695 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565301-skqb8"]
Mar 19 11:01:00.524044 master-0 kubenswrapper[15202]: I0319 11:01:00.523971 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-fernet-keys\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.524285 master-0 kubenswrapper[15202]: I0319 11:01:00.524076 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-config-data\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.524285 master-0 kubenswrapper[15202]: I0319 11:01:00.524106 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-combined-ca-bundle\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.524285 master-0 kubenswrapper[15202]: I0319 11:01:00.524158 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m9wn\" (UniqueName: \"kubernetes.io/projected/29d381c7-25dc-4664-b653-142b34e3188d-kube-api-access-5m9wn\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.627242 master-0 kubenswrapper[15202]: I0319 11:01:00.627106 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-fernet-keys\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.627636 master-0 kubenswrapper[15202]: I0319 11:01:00.627611 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-combined-ca-bundle\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.627805 master-0 kubenswrapper[15202]: I0319 11:01:00.627786 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-config-data\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.627928 master-0 kubenswrapper[15202]: I0319 11:01:00.627912 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m9wn\" (UniqueName: \"kubernetes.io/projected/29d381c7-25dc-4664-b653-142b34e3188d-kube-api-access-5m9wn\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.633577 master-0 kubenswrapper[15202]: I0319 11:01:00.633517 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-fernet-keys\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.634690 master-0 kubenswrapper[15202]: I0319 11:01:00.634634 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-combined-ca-bundle\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.637247 master-0 kubenswrapper[15202]: I0319 11:01:00.637186 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-config-data\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.647322 master-0 kubenswrapper[15202]: I0319 11:01:00.647239 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m9wn\" (UniqueName: \"kubernetes.io/projected/29d381c7-25dc-4664-b653-142b34e3188d-kube-api-access-5m9wn\") pod \"keystone-cron-29565301-skqb8\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") " pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:00.697143 master-0 kubenswrapper[15202]: I0319 11:01:00.697054 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:01.547310 master-0 kubenswrapper[15202]: I0319 11:01:01.547210 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565301-skqb8"]
Mar 19 11:01:01.635545 master-0 kubenswrapper[15202]: I0319 11:01:01.635451 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-skqb8" event={"ID":"29d381c7-25dc-4664-b653-142b34e3188d","Type":"ContainerStarted","Data":"4dcba996c76d0b4ca01cdfc08ea305cc23fe5fdf801bdf5f282bebf1fcefdfd4"}
Mar 19 11:01:02.650166 master-0 kubenswrapper[15202]: I0319 11:01:02.650096 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-skqb8" event={"ID":"29d381c7-25dc-4664-b653-142b34e3188d","Type":"ContainerStarted","Data":"16d6654539b05a47e68deddf156e099ab335fb5393dce000bf42e4951033ac5b"}
Mar 19 11:01:02.681003 master-0 kubenswrapper[15202]: I0319 11:01:02.680908 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565301-skqb8" podStartSLOduration=2.680814533 podStartE2EDuration="2.680814533s" podCreationTimestamp="2026-03-19 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:01:02.671653328 +0000 UTC m=+5780.057068164" watchObservedRunningTime="2026-03-19 11:01:02.680814533 +0000 UTC m=+5780.066229359"
Mar 19 11:01:05.685456 master-0 kubenswrapper[15202]: I0319 11:01:05.685395 15202 generic.go:334] "Generic (PLEG): container finished" podID="29d381c7-25dc-4664-b653-142b34e3188d" containerID="16d6654539b05a47e68deddf156e099ab335fb5393dce000bf42e4951033ac5b" exitCode=0
Mar 19 11:01:05.685456 master-0 kubenswrapper[15202]: I0319 11:01:05.685452 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-skqb8" event={"ID":"29d381c7-25dc-4664-b653-142b34e3188d","Type":"ContainerDied","Data":"16d6654539b05a47e68deddf156e099ab335fb5393dce000bf42e4951033ac5b"}
Mar 19 11:01:07.132222 master-0 kubenswrapper[15202]: I0319 11:01:07.132165 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:01:07.237599 master-0 kubenswrapper[15202]: I0319 11:01:07.237498 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-config-data\") pod \"29d381c7-25dc-4664-b653-142b34e3188d\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") "
Mar 19 11:01:07.237940 master-0 kubenswrapper[15202]: I0319 11:01:07.237667 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m9wn\" (UniqueName: \"kubernetes.io/projected/29d381c7-25dc-4664-b653-142b34e3188d-kube-api-access-5m9wn\") pod \"29d381c7-25dc-4664-b653-142b34e3188d\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") "
Mar 19 11:01:07.237940 master-0 kubenswrapper[15202]: I0319 11:01:07.237727 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-combined-ca-bundle\") pod \"29d381c7-25dc-4664-b653-142b34e3188d\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") "
Mar 19 11:01:07.237940 master-0 kubenswrapper[15202]: I0319 11:01:07.237766 15202 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-fernet-keys\") pod \"29d381c7-25dc-4664-b653-142b34e3188d\" (UID: \"29d381c7-25dc-4664-b653-142b34e3188d\") "
Mar 19 11:01:07.255577 master-0 kubenswrapper[15202]: I0319 11:01:07.255410 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d381c7-25dc-4664-b653-142b34e3188d-kube-api-access-5m9wn" (OuterVolumeSpecName: "kube-api-access-5m9wn") pod "29d381c7-25dc-4664-b653-142b34e3188d" (UID: "29d381c7-25dc-4664-b653-142b34e3188d"). InnerVolumeSpecName "kube-api-access-5m9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:01:07.256886 master-0 kubenswrapper[15202]: I0319 11:01:07.256828 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "29d381c7-25dc-4664-b653-142b34e3188d" (UID: "29d381c7-25dc-4664-b653-142b34e3188d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:01:07.294386 master-0 kubenswrapper[15202]: I0319 11:01:07.294235 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29d381c7-25dc-4664-b653-142b34e3188d" (UID: "29d381c7-25dc-4664-b653-142b34e3188d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:01:07.322412 master-0 kubenswrapper[15202]: I0319 11:01:07.322021 15202 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-config-data" (OuterVolumeSpecName: "config-data") pod "29d381c7-25dc-4664-b653-142b34e3188d" (UID: "29d381c7-25dc-4664-b653-142b34e3188d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:01:07.341954 master-0 kubenswrapper[15202]: I0319 11:01:07.341794 15202 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 11:01:07.341954 master-0 kubenswrapper[15202]: I0319 11:01:07.341884 15202 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m9wn\" (UniqueName: \"kubernetes.io/projected/29d381c7-25dc-4664-b653-142b34e3188d-kube-api-access-5m9wn\") on node \"master-0\" DevicePath \"\""
Mar 19 11:01:07.341954 master-0 kubenswrapper[15202]: I0319 11:01:07.341898 15202 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 11:01:07.341954 master-0 kubenswrapper[15202]: I0319 11:01:07.341908 15202 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29d381c7-25dc-4664-b653-142b34e3188d-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 11:01:07.727024 master-0 kubenswrapper[15202]: I0319 11:01:07.726945 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-skqb8" event={"ID":"29d381c7-25dc-4664-b653-142b34e3188d","Type":"ContainerDied","Data":"4dcba996c76d0b4ca01cdfc08ea305cc23fe5fdf801bdf5f282bebf1fcefdfd4"}
Mar 19 11:01:07.727024 master-0 kubenswrapper[15202]: I0319 11:01:07.727008 15202 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dcba996c76d0b4ca01cdfc08ea305cc23fe5fdf801bdf5f282bebf1fcefdfd4"
Mar 19 11:01:07.727971 master-0 kubenswrapper[15202]: I0319 11:01:07.727079 15202 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-skqb8"
Mar 19 11:10:35.148191 master-0 kubenswrapper[15202]: I0319 11:10:35.146186 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4zsp/must-gather-d2f7k"]
Mar 19 11:10:35.148191 master-0 kubenswrapper[15202]: E0319 11:10:35.146823 15202 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d381c7-25dc-4664-b653-142b34e3188d" containerName="keystone-cron"
Mar 19 11:10:35.148191 master-0 kubenswrapper[15202]: I0319 11:10:35.146838 15202 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d381c7-25dc-4664-b653-142b34e3188d" containerName="keystone-cron"
Mar 19 11:10:35.148191 master-0 kubenswrapper[15202]: I0319 11:10:35.147158 15202 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d381c7-25dc-4664-b653-142b34e3188d" containerName="keystone-cron"
Mar 19 11:10:35.160491 master-0 kubenswrapper[15202]: I0319 11:10:35.159215 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.164528 master-0 kubenswrapper[15202]: I0319 11:10:35.163664 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p4zsp"/"kube-root-ca.crt"
Mar 19 11:10:35.164528 master-0 kubenswrapper[15202]: I0319 11:10:35.163918 15202 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p4zsp"/"openshift-service-ca.crt"
Mar 19 11:10:35.175503 master-0 kubenswrapper[15202]: I0319 11:10:35.173928 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4zsp/must-gather-d2f7k"]
Mar 19 11:10:35.214505 master-0 kubenswrapper[15202]: I0319 11:10:35.213172 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4zsp/must-gather-zlxvl"]
Mar 19 11:10:35.216628 master-0 kubenswrapper[15202]: I0319 11:10:35.215666 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.250460 master-0 kubenswrapper[15202]: I0319 11:10:35.250371 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rbxb\" (UniqueName: \"kubernetes.io/projected/deb6063b-f872-4c7a-bf06-f105b6d9125d-kube-api-access-9rbxb\") pod \"must-gather-d2f7k\" (UID: \"deb6063b-f872-4c7a-bf06-f105b6d9125d\") " pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.250460 master-0 kubenswrapper[15202]: I0319 11:10:35.250491 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/deb6063b-f872-4c7a-bf06-f105b6d9125d-must-gather-output\") pod \"must-gather-d2f7k\" (UID: \"deb6063b-f872-4c7a-bf06-f105b6d9125d\") " pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.263498 master-0 kubenswrapper[15202]: I0319 11:10:35.251145 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c-must-gather-output\") pod \"must-gather-zlxvl\" (UID: \"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c\") " pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.263498 master-0 kubenswrapper[15202]: I0319 11:10:35.251187 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zpgc\" (UniqueName: \"kubernetes.io/projected/8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c-kube-api-access-9zpgc\") pod \"must-gather-zlxvl\" (UID: \"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c\") " pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.358498 master-0 kubenswrapper[15202]: I0319 11:10:35.352161 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/deb6063b-f872-4c7a-bf06-f105b6d9125d-must-gather-output\") pod \"must-gather-d2f7k\" (UID: \"deb6063b-f872-4c7a-bf06-f105b6d9125d\") " pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.358498 master-0 kubenswrapper[15202]: I0319 11:10:35.352248 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c-must-gather-output\") pod \"must-gather-zlxvl\" (UID: \"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c\") " pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.358498 master-0 kubenswrapper[15202]: I0319 11:10:35.352270 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zpgc\" (UniqueName: \"kubernetes.io/projected/8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c-kube-api-access-9zpgc\") pod \"must-gather-zlxvl\" (UID: \"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c\") " pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.358498 master-0 kubenswrapper[15202]: I0319 11:10:35.352393 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rbxb\" (UniqueName: \"kubernetes.io/projected/deb6063b-f872-4c7a-bf06-f105b6d9125d-kube-api-access-9rbxb\") pod \"must-gather-d2f7k\" (UID: \"deb6063b-f872-4c7a-bf06-f105b6d9125d\") " pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.358498 master-0 kubenswrapper[15202]: I0319 11:10:35.353115 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/deb6063b-f872-4c7a-bf06-f105b6d9125d-must-gather-output\") pod \"must-gather-d2f7k\" (UID: \"deb6063b-f872-4c7a-bf06-f105b6d9125d\") " pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.365492 master-0 kubenswrapper[15202]: I0319 11:10:35.360006 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c-must-gather-output\") pod \"must-gather-zlxvl\" (UID: \"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c\") " pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.385489 master-0 kubenswrapper[15202]: I0319 11:10:35.385161 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rbxb\" (UniqueName: \"kubernetes.io/projected/deb6063b-f872-4c7a-bf06-f105b6d9125d-kube-api-access-9rbxb\") pod \"must-gather-d2f7k\" (UID: \"deb6063b-f872-4c7a-bf06-f105b6d9125d\") " pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:35.393496 master-0 kubenswrapper[15202]: I0319 11:10:35.392075 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4zsp/must-gather-zlxvl"]
Mar 19 11:10:35.403321 master-0 kubenswrapper[15202]: I0319 11:10:35.394488 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zpgc\" (UniqueName: \"kubernetes.io/projected/8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c-kube-api-access-9zpgc\") pod \"must-gather-zlxvl\" (UID: \"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c\") " pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.523090 master-0 kubenswrapper[15202]: I0319 11:10:35.522645 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/must-gather-zlxvl"
Mar 19 11:10:35.625311 master-0 kubenswrapper[15202]: I0319 11:10:35.625213 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/must-gather-d2f7k"
Mar 19 11:10:36.104608 master-0 kubenswrapper[15202]: I0319 11:10:36.104032 15202 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 11:10:36.112277 master-0 kubenswrapper[15202]: I0319 11:10:36.112197 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4zsp/must-gather-zlxvl"]
Mar 19 11:10:36.253169 master-0 kubenswrapper[15202]: W0319 11:10:36.252793 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb6063b_f872_4c7a_bf06_f105b6d9125d.slice/crio-45f62425e5466a3d38a4cf9f6749c87a0134e7efeb0cb5dd1823c6fb816b93cf WatchSource:0}: Error finding container 45f62425e5466a3d38a4cf9f6749c87a0134e7efeb0cb5dd1823c6fb816b93cf: Status 404 returned error can't find the container with id 45f62425e5466a3d38a4cf9f6749c87a0134e7efeb0cb5dd1823c6fb816b93cf
Mar 19 11:10:36.254062 master-0 kubenswrapper[15202]: I0319 11:10:36.254004 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4zsp/must-gather-d2f7k"]
Mar 19 11:10:37.074030 master-0 kubenswrapper[15202]: I0319 11:10:37.073868 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/must-gather-d2f7k" event={"ID":"deb6063b-f872-4c7a-bf06-f105b6d9125d","Type":"ContainerStarted","Data":"45f62425e5466a3d38a4cf9f6749c87a0134e7efeb0cb5dd1823c6fb816b93cf"}
Mar 19 11:10:37.077844 master-0 kubenswrapper[15202]: I0319 11:10:37.077797 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/must-gather-zlxvl" event={"ID":"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c","Type":"ContainerStarted","Data":"5a4954cb67749b536f35544087c0c7fffa01ffb696e61fd785c6b23713cf8092"}
Mar 19 11:10:39.126352 master-0 kubenswrapper[15202]: I0319 11:10:39.126265 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/must-gather-zlxvl" event={"ID":"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c","Type":"ContainerStarted","Data":"427b98b9a969c2a860c464d6a0c82ebfe1d4fccad0d351e87cb567c03c0434a6"}
Mar 19 11:10:39.126352 master-0 kubenswrapper[15202]: I0319 11:10:39.126342 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/must-gather-zlxvl" event={"ID":"8e782ce5-d4ff-430b-a5b9-f8d8fb1f650c","Type":"ContainerStarted","Data":"351d79224ae052eb89f969bd5e5de4ce26ec457df392f6cd53ca8a974e076b98"}
Mar 19 11:10:39.174503 master-0 kubenswrapper[15202]: I0319 11:10:39.171801 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p4zsp/must-gather-zlxvl" podStartSLOduration=2.683857797 podStartE2EDuration="4.171773509s" podCreationTimestamp="2026-03-19 11:10:35 +0000 UTC" firstStartedPulling="2026-03-19 11:10:36.103782703 +0000 UTC m=+6353.489197549" lastFinishedPulling="2026-03-19 11:10:37.591698445 +0000 UTC m=+6354.977113261" observedRunningTime="2026-03-19 11:10:39.155158809 +0000 UTC m=+6356.540573625" watchObservedRunningTime="2026-03-19 11:10:39.171773509 +0000 UTC m=+6356.557188325"
Mar 19 11:10:42.292781 master-0 kubenswrapper[15202]: I0319 11:10:42.292528 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7d58488df-thkn2_dc9945ac-4041-4120-b504-a173c2bf91bd/cluster-version-operator/0.log"
Mar 19 11:10:46.004567 master-0 kubenswrapper[15202]: I0319 11:10:46.002022 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jkh97_5cc321ea-4a7a-440d-a58c-a9d141f87363/controller/0.log"
Mar 19 11:10:46.037405 master-0 kubenswrapper[15202]: I0319 11:10:46.037362 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jkh97_5cc321ea-4a7a-440d-a58c-a9d141f87363/kube-rbac-proxy/0.log"
Mar 19 11:10:46.067692 master-0 kubenswrapper[15202]: I0319 11:10:46.067610 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/controller/0.log"
Mar 19 11:10:46.133815 master-0 kubenswrapper[15202]: E0319 11:10:46.133653 15202 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.32.10:45432->192.168.32.10:35219: read tcp 192.168.32.10:45432->192.168.32.10:35219: read: connection reset by peer
Mar 19 11:10:46.551501 master-0 kubenswrapper[15202]: I0319 11:10:46.549340 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-dlgsc_a4c02ef0-564a-4eec-8979-6e4a764bfddc/nmstate-console-plugin/0.log"
Mar 19 11:10:46.608870 master-0 kubenswrapper[15202]: I0319 11:10:46.606862 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gns5r_f9e7d1ea-3a5e-460c-8b32-5687e773d19d/nmstate-handler/0.log"
Mar 19 11:10:46.633752 master-0 kubenswrapper[15202]: I0319 11:10:46.633680 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cpgt6_e0518d63-cd74-47d9-8d59-bc542409fec0/nmstate-metrics/0.log"
Mar 19 11:10:46.680786 master-0 kubenswrapper[15202]: I0319 11:10:46.678746 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cpgt6_e0518d63-cd74-47d9-8d59-bc542409fec0/kube-rbac-proxy/0.log"
Mar 19 11:10:46.719869 master-0 kubenswrapper[15202]: I0319 11:10:46.716944 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-h6jnz_628c6cfa-b09e-4a74-a152-20da732dd6db/nmstate-operator/0.log"
Mar 19 11:10:46.756089 master-0 kubenswrapper[15202]: I0319 11:10:46.755848 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-5wgm6_4dc72b1a-a76e-4246-be58-5576544be5a8/nmstate-webhook/0.log"
Mar 19 11:10:47.391163 master-0 kubenswrapper[15202]: I0319 11:10:47.390968 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/frr/0.log"
Mar 19 11:10:47.420389 master-0 kubenswrapper[15202]: I0319 11:10:47.420303 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/reloader/0.log"
Mar 19 11:10:47.456526 master-0 kubenswrapper[15202]: I0319 11:10:47.451973 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/frr-metrics/0.log"
Mar 19 11:10:47.477502 master-0 kubenswrapper[15202]: I0319 11:10:47.475621 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/kube-rbac-proxy/0.log"
Mar 19 11:10:47.500508 master-0 kubenswrapper[15202]: I0319 11:10:47.497657 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/kube-rbac-proxy-frr/0.log"
Mar 19 11:10:47.521496 master-0 kubenswrapper[15202]: I0319 11:10:47.517792 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/cp-frr-files/0.log"
Mar 19 11:10:47.547503 master-0 kubenswrapper[15202]: I0319 11:10:47.540239 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/cp-reloader/0.log"
Mar 19 11:10:47.571513 master-0 kubenswrapper[15202]: I0319 11:10:47.568597 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/cp-metrics/0.log"
Mar 19 11:10:47.638462 master-0 kubenswrapper[15202]: I0319 11:10:47.637903 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-sfpc9_fb00ada9-e047-47a7-82b0-44a3a66d6669/frr-k8s-webhook-server/0.log"
Mar 19 11:10:47.690501 master-0 kubenswrapper[15202]: I0319 11:10:47.690101 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d7b76b756-hw274_52d0ba98-8db8-45ec-b212-8bec41dac138/manager/0.log"
Mar 19 11:10:47.710554 master-0 kubenswrapper[15202]: I0319 11:10:47.709837 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-754b74fdf5-vvbj2_87dc574f-d263-420b-9029-edc87ea6c142/webhook-server/0.log"
Mar 19 11:10:48.159213 master-0 kubenswrapper[15202]: I0319 11:10:48.157120 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jkzd2_f2a48cff-3780-4bd2-b12c-e6b77a990d8b/speaker/0.log"
Mar 19 11:10:48.163533 master-0 kubenswrapper[15202]: I0319 11:10:48.163334 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jkzd2_f2a48cff-3780-4bd2-b12c-e6b77a990d8b/kube-rbac-proxy/0.log"
Mar 19 11:10:50.393493 master-0 kubenswrapper[15202]: I0319 11:10:50.391902 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcdctl/0.log"
Mar 19 11:10:51.316831 master-0 kubenswrapper[15202]: I0319 11:10:51.316765 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd/0.log"
Mar 19 11:10:51.343498 master-0 kubenswrapper[15202]: I0319 11:10:51.341391 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-metrics/0.log"
Mar 19 11:10:51.368629 master-0 kubenswrapper[15202]: I0319 11:10:51.368504 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-readyz/0.log"
Mar 19 11:10:51.399160 master-0 kubenswrapper[15202]: I0319 11:10:51.398837 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-rev/0.log"
Mar 19 11:10:51.431712 master-0 kubenswrapper[15202]: I0319 11:10:51.431131 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/setup/0.log"
Mar 19 11:10:51.448498 master-0 kubenswrapper[15202]: I0319 11:10:51.447913 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-ensure-env-vars/0.log"
Mar 19 11:10:51.465950 master-0 kubenswrapper[15202]: I0319 11:10:51.465903 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-resources-copy/0.log"
Mar 19 11:10:51.546664 master-0 kubenswrapper[15202]: I0319 11:10:51.546550 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_259aa9cc-51a9-498e-b099-ba4d781801c5/installer/0.log"
Mar 19 11:10:51.607278 master-0 kubenswrapper[15202]: I0319 11:10:51.607079 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_7e9b2506-dac6-4a23-b2bf-e3ce77919857/installer/0.log"
Mar 19 11:10:51.674651 master-0 kubenswrapper[15202]: I0319 11:10:51.673738 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-69bfd98cf-4dhhm_a00456f4-7f6b-4c56-bcd6-72e0f04b84d6/oauth-openshift/0.log"
Mar 19 11:10:53.431965 master-0 kubenswrapper[15202]: I0319 11:10:53.431889 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/must-gather-d2f7k" event={"ID":"deb6063b-f872-4c7a-bf06-f105b6d9125d","Type":"ContainerStarted","Data":"0d0dea7f664f0f485f91ead750fdd2374a9d213591b94e2ba8e32f51f2ddd5c2"}
Mar 19 11:10:53.432715 master-0 kubenswrapper[15202]: I0319 11:10:53.431979 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/must-gather-d2f7k" event={"ID":"deb6063b-f872-4c7a-bf06-f105b6d9125d","Type":"ContainerStarted","Data":"dd7fc2ba49d37cd576c289684b4560ec4b55e40e9a546f00a45d0c1a4a4a7c01"}
Mar 19 11:10:53.437704 master-0 kubenswrapper[15202]: I0319 11:10:53.437645 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-gn85g_9039b9d3-27c2-4c42-ae8b-28e40570b3c2/assisted-installer-controller/0.log"
Mar 19 11:10:53.465049 master-0 kubenswrapper[15202]: I0319 11:10:53.464894 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p4zsp/must-gather-d2f7k" podStartSLOduration=2.572994134 podStartE2EDuration="18.464863377s" podCreationTimestamp="2026-03-19 11:10:35 +0000 UTC" firstStartedPulling="2026-03-19 11:10:36.272134111 +0000 UTC m=+6353.657548927" lastFinishedPulling="2026-03-19 11:10:52.164003354 +0000 UTC m=+6369.549418170" observedRunningTime="2026-03-19 11:10:53.453232651 +0000 UTC m=+6370.838647467" watchObservedRunningTime="2026-03-19 11:10:53.464863377 +0000 UTC m=+6370.850278193"
Mar 19 11:10:53.755498 master-0 kubenswrapper[15202]: I0319 11:10:53.755098 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-z8gbk_357980ba-1957-412f-afb5-04281eca2bee/authentication-operator/0.log"
Mar 19 11:10:53.840519 master-0 kubenswrapper[15202]: I0319 11:10:53.839838 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-z8gbk_357980ba-1957-412f-afb5-04281eca2bee/authentication-operator/1.log"
Mar 19 11:10:55.072451 master-0 kubenswrapper[15202]: I0319 11:10:55.071531 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dcf5569b5-4cst9_dff4eb24-47ac-46be-bf3d-d939bd739b52/router/0.log"
Mar 19 11:10:55.735391 master-0 kubenswrapper[15202]: I0319 11:10:55.735300 15202 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd"]
Mar 19 11:10:55.738440 master-0 kubenswrapper[15202]: I0319 11:10:55.738374 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd"
Mar 19 11:10:55.752618 master-0 kubenswrapper[15202]: I0319 11:10:55.752549 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd"]
Mar 19 11:10:55.856113 master-0 kubenswrapper[15202]: I0319 11:10:55.856016 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-proc\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd"
Mar 19 11:10:55.856507 master-0 kubenswrapper[15202]: I0319 11:10:55.856249 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-podres\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd"
Mar 19 11:10:55.856507 master-0 kubenswrapper[15202]: I0319 11:10:55.856348 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-sys\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") "
pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.856507 master-0 kubenswrapper[15202]: I0319 11:10:55.856409 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72cz\" (UniqueName: \"kubernetes.io/projected/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-kube-api-access-f72cz\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.856674 master-0 kubenswrapper[15202]: I0319 11:10:55.856525 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-lib-modules\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962225 master-0 kubenswrapper[15202]: I0319 11:10:55.962139 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72cz\" (UniqueName: \"kubernetes.io/projected/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-kube-api-access-f72cz\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962586 master-0 kubenswrapper[15202]: I0319 11:10:55.962280 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-lib-modules\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962586 master-0 kubenswrapper[15202]: I0319 11:10:55.962372 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proc\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-proc\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962586 master-0 kubenswrapper[15202]: I0319 11:10:55.962433 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-podres\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962586 master-0 kubenswrapper[15202]: I0319 11:10:55.962510 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-sys\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962805 master-0 kubenswrapper[15202]: I0319 11:10:55.962740 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-proc\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.962879 master-0 kubenswrapper[15202]: I0319 11:10:55.962854 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-sys\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.963354 master-0 kubenswrapper[15202]: I0319 11:10:55.963311 15202 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-lib-modules\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.963432 master-0 kubenswrapper[15202]: I0319 11:10:55.963405 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-podres\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:55.981606 master-0 kubenswrapper[15202]: I0319 11:10:55.981420 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72cz\" (UniqueName: \"kubernetes.io/projected/f91f1c60-21bf-40bc-bb3c-427cc099f7c8-kube-api-access-f72cz\") pod \"perf-node-gather-daemonset-wr8nd\" (UID: \"f91f1c60-21bf-40bc-bb3c-427cc099f7c8\") " pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:56.111379 master-0 kubenswrapper[15202]: I0319 11:10:56.111227 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-6fccf84fc5-rnmt2_b2bff8a5-c45d-4d28-8771-2239ad0fa578/oauth-apiserver/0.log" Mar 19 11:10:56.116867 master-0 kubenswrapper[15202]: I0319 11:10:56.116725 15202 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:56.136896 master-0 kubenswrapper[15202]: I0319 11:10:56.136835 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-6fccf84fc5-rnmt2_b2bff8a5-c45d-4d28-8771-2239ad0fa578/fix-audit-permissions/0.log" Mar 19 11:10:56.748434 master-0 kubenswrapper[15202]: I0319 11:10:56.748331 15202 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd"] Mar 19 11:10:56.752942 master-0 kubenswrapper[15202]: W0319 11:10:56.752546 15202 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf91f1c60_21bf_40bc_bb3c_427cc099f7c8.slice/crio-4ca5da06c08ade3614ea913664a290b2bff8a2382a04ec6d61562f857382db71 WatchSource:0}: Error finding container 4ca5da06c08ade3614ea913664a290b2bff8a2382a04ec6d61562f857382db71: Status 404 returned error can't find the container with id 4ca5da06c08ade3614ea913664a290b2bff8a2382a04ec6d61562f857382db71 Mar 19 11:10:57.199831 master-0 kubenswrapper[15202]: I0319 11:10:57.199728 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/kube-rbac-proxy/0.log" Mar 19 11:10:57.515823 master-0 kubenswrapper[15202]: I0319 11:10:57.490850 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" event={"ID":"f91f1c60-21bf-40bc-bb3c-427cc099f7c8","Type":"ContainerStarted","Data":"4ca5da06c08ade3614ea913664a290b2bff8a2382a04ec6d61562f857382db71"} Mar 19 11:10:57.516122 master-0 kubenswrapper[15202]: I0319 11:10:57.515807 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/2.log" Mar 19 11:10:57.603771 master-0 
kubenswrapper[15202]: I0319 11:10:57.603613 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-hzrg4_d32541c9-eef6-417c-9f5a-a7392dc70aa0/cluster-autoscaler-operator/3.log" Mar 19 11:10:57.625538 master-0 kubenswrapper[15202]: I0319 11:10:57.625388 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/4.log" Mar 19 11:10:57.628515 master-0 kubenswrapper[15202]: I0319 11:10:57.628381 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/cluster-baremetal-operator/5.log" Mar 19 11:10:57.646796 master-0 kubenswrapper[15202]: I0319 11:10:57.646371 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-nm9nx_cd42096c-f18d-4bb5-8a51-8761dc1edb73/baremetal-kube-rbac-proxy/0.log" Mar 19 11:10:57.667427 master-0 kubenswrapper[15202]: I0319 11:10:57.666839 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-l8kmn_d486ce23-acf7-429a-9739-4770e1a2bf78/control-plane-machine-set-operator/1.log" Mar 19 11:10:57.667427 master-0 kubenswrapper[15202]: I0319 11:10:57.667036 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-l8kmn_d486ce23-acf7-429a-9739-4770e1a2bf78/control-plane-machine-set-operator/0.log" Mar 19 11:10:57.956627 master-0 kubenswrapper[15202]: I0319 11:10:57.950855 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_ironic-proxy-kc5xl_edb10ae4-b456-4b8f-8ed0-95b53ba1bdf1/ironic-proxy/0.log" Mar 19 11:10:57.991510 master-0 kubenswrapper[15202]: I0319 11:10:57.988918 15202 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-qx75g_f93b8728-4a33-4ee4-b7c6-cff7d7995953/kube-rbac-proxy/0.log" Mar 19 11:10:58.009351 master-0 kubenswrapper[15202]: I0319 11:10:58.009279 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-qx75g_f93b8728-4a33-4ee4-b7c6-cff7d7995953/machine-api-operator/1.log" Mar 19 11:10:58.010851 master-0 kubenswrapper[15202]: I0319 11:10:58.010823 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-qx75g_f93b8728-4a33-4ee4-b7c6-cff7d7995953/machine-api-operator/2.log" Mar 19 11:10:58.507946 master-0 kubenswrapper[15202]: I0319 11:10:58.507879 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" event={"ID":"f91f1c60-21bf-40bc-bb3c-427cc099f7c8","Type":"ContainerStarted","Data":"b4057e02db7503d3fca9960d9eaaf0e1e7240589cedc124f071d018329dd9caf"} Mar 19 11:10:58.508567 master-0 kubenswrapper[15202]: I0319 11:10:58.507993 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:10:58.528550 master-0 kubenswrapper[15202]: I0319 11:10:58.528285 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" podStartSLOduration=3.528266523 podStartE2EDuration="3.528266523s" podCreationTimestamp="2026-03-19 11:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:10:58.526298754 +0000 UTC m=+6375.911713590" watchObservedRunningTime="2026-03-19 11:10:58.528266523 +0000 UTC m=+6375.913681339" Mar 19 11:11:00.544774 master-0 kubenswrapper[15202]: I0319 11:11:00.544675 15202 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-p4zsp/master-0-debug-2l4jg"] Mar 19 11:11:00.546674 master-0 kubenswrapper[15202]: I0319 11:11:00.546642 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.596698 master-0 kubenswrapper[15202]: I0319 11:11:00.596609 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16475c20-12f3-450c-a1b3-b763e7940e47-host\") pod \"master-0-debug-2l4jg\" (UID: \"16475c20-12f3-450c-a1b3-b763e7940e47\") " pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.596965 master-0 kubenswrapper[15202]: I0319 11:11:00.596715 15202 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgl84\" (UniqueName: \"kubernetes.io/projected/16475c20-12f3-450c-a1b3-b763e7940e47-kube-api-access-wgl84\") pod \"master-0-debug-2l4jg\" (UID: \"16475c20-12f3-450c-a1b3-b763e7940e47\") " pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.699278 master-0 kubenswrapper[15202]: I0319 11:11:00.699002 15202 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16475c20-12f3-450c-a1b3-b763e7940e47-host\") pod \"master-0-debug-2l4jg\" (UID: \"16475c20-12f3-450c-a1b3-b763e7940e47\") " pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.699278 master-0 kubenswrapper[15202]: I0319 11:11:00.699194 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16475c20-12f3-450c-a1b3-b763e7940e47-host\") pod \"master-0-debug-2l4jg\" (UID: \"16475c20-12f3-450c-a1b3-b763e7940e47\") " pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.699278 master-0 kubenswrapper[15202]: I0319 11:11:00.699215 15202 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wgl84\" (UniqueName: \"kubernetes.io/projected/16475c20-12f3-450c-a1b3-b763e7940e47-kube-api-access-wgl84\") pod \"master-0-debug-2l4jg\" (UID: \"16475c20-12f3-450c-a1b3-b763e7940e47\") " pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.716961 master-0 kubenswrapper[15202]: I0319 11:11:00.716903 15202 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgl84\" (UniqueName: \"kubernetes.io/projected/16475c20-12f3-450c-a1b3-b763e7940e47-kube-api-access-wgl84\") pod \"master-0-debug-2l4jg\" (UID: \"16475c20-12f3-450c-a1b3-b763e7940e47\") " pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:00.888113 master-0 kubenswrapper[15202]: I0319 11:11:00.887954 15202 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" Mar 19 11:11:01.573272 master-0 kubenswrapper[15202]: I0319 11:11:01.573152 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" event={"ID":"16475c20-12f3-450c-a1b3-b763e7940e47","Type":"ContainerStarted","Data":"4f056942409b3cf72062ffe108e8c625ee192a9e8a586cb22180bfe99902bf3d"} Mar 19 11:11:01.971413 master-0 kubenswrapper[15202]: I0319 11:11:01.968094 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-546c754db-8r9wh_90e6a8d7-86b3-4082-a6f7-4d1001e48563/metal3-httpd/0.log" Mar 19 11:11:02.639954 master-0 kubenswrapper[15202]: I0319 11:11:02.639889 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-546c754db-8r9wh_90e6a8d7-86b3-4082-a6f7-4d1001e48563/metal3-ironic/0.log" Mar 19 11:11:02.658449 master-0 kubenswrapper[15202]: I0319 11:11:02.657236 15202 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_metal3-546c754db-8r9wh_90e6a8d7-86b3-4082-a6f7-4d1001e48563/metal3-ramdisk-logs/0.log" Mar 19 11:11:02.694688 master-0 kubenswrapper[15202]: I0319 11:11:02.694630 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-546c754db-8r9wh_90e6a8d7-86b3-4082-a6f7-4d1001e48563/machine-os-images/0.log" Mar 19 11:11:02.913054 master-0 kubenswrapper[15202]: I0319 11:11:02.912930 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-api-0_29cc00b0-1537-42ce-b8ce-918dea958cf9/cinder-7ba05-api-log/0.log" Mar 19 11:11:03.150626 master-0 kubenswrapper[15202]: I0319 11:11:03.150567 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-api-0_29cc00b0-1537-42ce-b8ce-918dea958cf9/cinder-api/0.log" Mar 19 11:11:03.362436 master-0 kubenswrapper[15202]: I0319 11:11:03.362361 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-backup-0_eca23393-b469-4d29-bb25-0b1edae5d066/cinder-backup/0.log" Mar 19 11:11:03.477501 master-0 kubenswrapper[15202]: I0319 11:11:03.475780 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-baremetal-operator-78474bdc48-lpxgr_bf7f4c14-046a-493e-b6b8-1e821c66b504/metal3-baremetal-operator/0.log" Mar 19 11:11:03.500271 master-0 kubenswrapper[15202]: I0319 11:11:03.500191 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-image-customization-controller/0.log" Mar 19 11:11:03.512813 master-0 kubenswrapper[15202]: I0319 11:11:03.510763 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-backup-0_eca23393-b469-4d29-bb25-0b1edae5d066/probe/0.log" Mar 19 11:11:03.520915 master-0 kubenswrapper[15202]: I0319 11:11:03.520627 15202 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_metal3-image-customization-7b5d8dfcfd-gjzrj_127c2b98-5be4-46f3-95d6-1901fab637ff/machine-os-images/2.log" Mar 19 11:11:03.627567 master-0 kubenswrapper[15202]: I0319 11:11:03.627105 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-scheduler-0_45f14454-38fd-4f69-81b6-8d66033b21d4/cinder-scheduler/0.log" Mar 19 11:11:03.713439 master-0 kubenswrapper[15202]: I0319 11:11:03.713369 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-scheduler-0_45f14454-38fd-4f69-81b6-8d66033b21d4/probe/0.log" Mar 19 11:11:03.829991 master-0 kubenswrapper[15202]: I0319 11:11:03.829910 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-volume-lvm-iscsi-0_c8a687b2-1e3a-4510-af7c-34277b366455/cinder-volume/0.log" Mar 19 11:11:03.917658 master-0 kubenswrapper[15202]: I0319 11:11:03.917497 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-7ba05-volume-lvm-iscsi-0_c8a687b2-1e3a-4510-af7c-34277b366455/probe/0.log" Mar 19 11:11:03.950749 master-0 kubenswrapper[15202]: I0319 11:11:03.950634 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5687765f45-jhnth_eef5b369-ca6a-4da9-a54b-4d2cf46e4328/dnsmasq-dns/0.log" Mar 19 11:11:03.962386 master-0 kubenswrapper[15202]: I0319 11:11:03.962290 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5687765f45-jhnth_eef5b369-ca6a-4da9-a54b-4d2cf46e4328/init/0.log" Mar 19 11:11:04.107383 master-0 kubenswrapper[15202]: I0319 11:11:04.107298 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5_9bf992bb-2aac-49c3-8135-6ab9f3a53193/osp-httpd/0.log" Mar 19 11:11:04.119170 master-0 kubenswrapper[15202]: I0319 11:11:04.118929 15202 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_edpm-a-provisionserver-openstackprovisionserver-7544578cbc568v5_9bf992bb-2aac-49c3-8135-6ab9f3a53193/init/0.log" Mar 19 11:11:04.272390 master-0 kubenswrapper[15202]: I0319 11:11:04.272314 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm_e2d72e05-7738-45d4-8b7a-2bfdb439e7f5/osp-httpd/0.log" Mar 19 11:11:04.281258 master-0 kubenswrapper[15202]: I0319 11:11:04.281198 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_edpm-b-provisionserver-openstackprovisionserver-5dcffdb788cr7nm_e2d72e05-7738-45d4-8b7a-2bfdb439e7f5/init/0.log" Mar 19 11:11:04.422627 master-0 kubenswrapper[15202]: I0319 11:11:04.422044 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-3a5fd-default-external-api-0_c8e50d67-c919-4e31-a98d-882b87a58541/glance-log/0.log" Mar 19 11:11:04.458352 master-0 kubenswrapper[15202]: I0319 11:11:04.458238 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-3a5fd-default-external-api-0_c8e50d67-c919-4e31-a98d-882b87a58541/glance-httpd/0.log" Mar 19 11:11:04.594695 master-0 kubenswrapper[15202]: I0319 11:11:04.594340 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-3a5fd-default-internal-api-0_fca47216-8f0d-4d96-b557-0f35c442eccb/glance-log/0.log" Mar 19 11:11:04.635302 master-0 kubenswrapper[15202]: I0319 11:11:04.635192 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-3a5fd-default-internal-api-0_fca47216-8f0d-4d96-b557-0f35c442eccb/glance-httpd/0.log" Mar 19 11:11:04.696424 master-0 kubenswrapper[15202]: I0319 11:11:04.696366 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6b44d66bc9-5zxbb_c79cbfca-37c7-4d97-87a9-6da6333a6302/keystone-api/0.log" Mar 19 11:11:04.716324 master-0 kubenswrapper[15202]: I0319 11:11:04.714622 15202 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_keystone-cron-29565241-vpcdg_aa3b5bce-9cb9-47f1-9b40-238df5b3a007/keystone-cron/0.log" Mar 19 11:11:04.738266 master-0 kubenswrapper[15202]: I0319 11:11:04.736079 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565301-skqb8_29d381c7-25dc-4664-b653-142b34e3188d/keystone-cron/0.log" Mar 19 11:11:05.374788 master-0 kubenswrapper[15202]: I0319 11:11:05.374696 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/cluster-cloud-controller-manager/0.log" Mar 19 11:11:05.378865 master-0 kubenswrapper[15202]: I0319 11:11:05.378819 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/cluster-cloud-controller-manager/1.log" Mar 19 11:11:05.406060 master-0 kubenswrapper[15202]: I0319 11:11:05.405966 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/config-sync-controllers/0.log" Mar 19 11:11:05.439561 master-0 kubenswrapper[15202]: I0319 11:11:05.439394 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/config-sync-controllers/1.log" Mar 19 11:11:05.463025 master-0 kubenswrapper[15202]: I0319 11:11:05.462893 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-rz5nt_a4149b83-964c-4bd2-9769-44c7b9da0a52/kube-rbac-proxy/0.log" Mar 19 11:11:06.219613 master-0 kubenswrapper[15202]: I0319 
11:11:06.215447 15202 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p4zsp/perf-node-gather-daemonset-wr8nd" Mar 19 11:11:08.688851 master-0 kubenswrapper[15202]: I0319 11:11:08.688775 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-s7ts2_c2a16f6f-437c-4da5-a797-287e5e1ddbd4/kube-rbac-proxy/0.log" Mar 19 11:11:08.750814 master-0 kubenswrapper[15202]: I0319 11:11:08.750669 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-s7ts2_c2a16f6f-437c-4da5-a797-287e5e1ddbd4/cloud-credential-operator/1.log" Mar 19 11:11:10.144392 master-0 kubenswrapper[15202]: I0319 11:11:10.143378 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d9d44135-dd46-4cc3-aa4f-21c5b9d1604c/memcached/0.log" Mar 19 11:11:10.275607 master-0 kubenswrapper[15202]: I0319 11:11:10.275204 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77db675565-g4zz2_1c78c47a-7a9a-4835-92ed-a3da198b2cc8/neutron-api/0.log" Mar 19 11:11:10.309284 master-0 kubenswrapper[15202]: I0319 11:11:10.308796 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77db675565-g4zz2_1c78c47a-7a9a-4835-92ed-a3da198b2cc8/neutron-httpd/0.log" Mar 19 11:11:10.479340 master-0 kubenswrapper[15202]: I0319 11:11:10.478843 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_63107e92-79e8-45d6-af32-05d718986530/nova-api-log/0.log" Mar 19 11:11:11.066719 master-0 kubenswrapper[15202]: I0319 11:11:11.041403 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_63107e92-79e8-45d6-af32-05d718986530/nova-api-api/0.log" Mar 19 11:11:11.176527 master-0 kubenswrapper[15202]: I0319 11:11:11.171044 15202 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_801f6df2-3122-47bf-839d-bc6b737aa320/nova-cell0-conductor-conductor/0.log"
Mar 19 11:11:11.277568 master-0 kubenswrapper[15202]: I0319 11:11:11.277040 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8816b584-8f92-4ac4-93f3-fcd86f5e64a2/nova-cell1-conductor-conductor/0.log"
Mar 19 11:11:11.387837 master-0 kubenswrapper[15202]: I0319 11:11:11.387759 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2ca1bcf5-5b69-4fac-91c0-af03f6f99980/nova-cell1-novncproxy-novncproxy/0.log"
Mar 19 11:11:11.472592 master-0 kubenswrapper[15202]: I0319 11:11:11.471645 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_17791259-f6c3-44a3-9ee1-f87b6c7db780/nova-metadata-log/0.log"
Mar 19 11:11:11.833817 master-0 kubenswrapper[15202]: I0319 11:11:11.833653 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-bqqqq_7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/openshift-config-operator/0.log"
Mar 19 11:11:11.836722 master-0 kubenswrapper[15202]: I0319 11:11:11.836687 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-bqqqq_7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/openshift-config-operator/1.log"
Mar 19 11:11:11.856821 master-0 kubenswrapper[15202]: I0319 11:11:11.856767 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-bqqqq_7fe9dd78-a067-4b36-8a6b-5f20a5f0abe8/openshift-api/0.log"
Mar 19 11:11:12.156299 master-0 kubenswrapper[15202]: I0319 11:11:12.156164 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_17791259-f6c3-44a3-9ee1-f87b6c7db780/nova-metadata-metadata/0.log"
Mar 19 11:11:12.298130 master-0 kubenswrapper[15202]: I0319 11:11:12.298037 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_661ff0ec-2637-474a-b47e-658ac7e62908/nova-scheduler-scheduler/0.log"
Mar 19 11:11:12.322135 master-0 kubenswrapper[15202]: I0319 11:11:12.322076 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a491330a-0016-4f3a-b003-bb80733aaaab/galera/0.log"
Mar 19 11:11:12.341383 master-0 kubenswrapper[15202]: I0319 11:11:12.341311 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a491330a-0016-4f3a-b003-bb80733aaaab/mysql-bootstrap/0.log"
Mar 19 11:11:12.372033 master-0 kubenswrapper[15202]: I0319 11:11:12.371970 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_534e2f72-f4ac-40f2-8dad-a1100e7c67b1/galera/0.log"
Mar 19 11:11:12.403861 master-0 kubenswrapper[15202]: I0319 11:11:12.403068 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_534e2f72-f4ac-40f2-8dad-a1100e7c67b1/mysql-bootstrap/0.log"
Mar 19 11:11:12.434642 master-0 kubenswrapper[15202]: I0319 11:11:12.434590 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7eb8479a-645c-40f7-852f-1b0fb72fa067/openstackclient/0.log"
Mar 19 11:11:12.455700 master-0 kubenswrapper[15202]: I0319 11:11:12.455641 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m68fw_2ed2b7a9-27a6-43ac-ba84-ae1a7d670160/ovn-controller/0.log"
Mar 19 11:11:12.474385 master-0 kubenswrapper[15202]: I0319 11:11:12.474336 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7dlz8_4ccdb24a-f249-4ca8-9f50-769bac7da7f0/openstack-network-exporter/0.log"
Mar 19 11:11:12.487775 master-0 kubenswrapper[15202]: I0319 11:11:12.487713 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sl66q_5d4e5d5b-673c-4292-8b11-b58920594cf5/ovsdb-server/0.log"
Mar 19 11:11:12.498950 master-0 kubenswrapper[15202]: I0319 11:11:12.498811 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sl66q_5d4e5d5b-673c-4292-8b11-b58920594cf5/ovs-vswitchd/0.log"
Mar 19 11:11:12.506674 master-0 kubenswrapper[15202]: I0319 11:11:12.506631 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sl66q_5d4e5d5b-673c-4292-8b11-b58920594cf5/ovsdb-server-init/0.log"
Mar 19 11:11:12.530497 master-0 kubenswrapper[15202]: I0319 11:11:12.527826 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b3afa041-e3b0-469b-810e-ce69f3a88264/ovn-northd/0.log"
Mar 19 11:11:12.540494 master-0 kubenswrapper[15202]: I0319 11:11:12.536006 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b3afa041-e3b0-469b-810e-ce69f3a88264/openstack-network-exporter/0.log"
Mar 19 11:11:12.562499 master-0 kubenswrapper[15202]: I0319 11:11:12.559893 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_756b7d24-4b2a-48d2-b574-c0a2f3f9a411/ovsdbserver-nb/0.log"
Mar 19 11:11:12.573540 master-0 kubenswrapper[15202]: I0319 11:11:12.569547 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_756b7d24-4b2a-48d2-b574-c0a2f3f9a411/openstack-network-exporter/0.log"
Mar 19 11:11:12.588316 master-0 kubenswrapper[15202]: I0319 11:11:12.587920 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5739f34d-56e3-4305-8f93-bf6d6636f5e6/ovsdbserver-sb/0.log"
Mar 19 11:11:12.598163 master-0 kubenswrapper[15202]: I0319 11:11:12.597777 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5739f34d-56e3-4305-8f93-bf6d6636f5e6/openstack-network-exporter/0.log"
Mar 19 11:11:12.683093 master-0 kubenswrapper[15202]: I0319 11:11:12.681055 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67c9b9475d-ksb2w_7328616d-ec33-44bc-a0ca-aad7c3ca650e/placement-log/0.log"
Mar 19 11:11:12.739429 master-0 kubenswrapper[15202]: I0319 11:11:12.739294 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67c9b9475d-ksb2w_7328616d-ec33-44bc-a0ca-aad7c3ca650e/placement-api/0.log"
Mar 19 11:11:12.796553 master-0 kubenswrapper[15202]: I0319 11:11:12.796198 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9bab9d65-06f1-4b08-aa8c-5f12e7d06183/rabbitmq/0.log"
Mar 19 11:11:12.824178 master-0 kubenswrapper[15202]: I0319 11:11:12.823720 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9bab9d65-06f1-4b08-aa8c-5f12e7d06183/setup-container/0.log"
Mar 19 11:11:12.869693 master-0 kubenswrapper[15202]: I0319 11:11:12.869610 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_67fbe9e8-1121-4091-954c-c6a620d98528/rabbitmq/0.log"
Mar 19 11:11:12.877920 master-0 kubenswrapper[15202]: I0319 11:11:12.877635 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_67fbe9e8-1121-4091-954c-c6a620d98528/setup-container/0.log"
Mar 19 11:11:13.060645 master-0 kubenswrapper[15202]: I0319 11:11:13.060375 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-77dc968fc8-nnkkj_180cd549-4f02-4a40-875d-5d44423f0b2f/proxy-httpd/0.log"
Mar 19 11:11:13.080612 master-0 kubenswrapper[15202]: I0319 11:11:13.080538 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-77dc968fc8-nnkkj_180cd549-4f02-4a40-875d-5d44423f0b2f/proxy-server/0.log"
Mar 19 11:11:13.091550 master-0 kubenswrapper[15202]: I0319 11:11:13.091498 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-l8hw9_e8372380-9188-4c0f-9e75-05739d26a27c/swift-ring-rebalance/0.log"
Mar 19 11:11:13.121226 master-0 kubenswrapper[15202]: I0319 11:11:13.121165 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/account-server/0.log"
Mar 19 11:11:13.163865 master-0 kubenswrapper[15202]: I0319 11:11:13.163783 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/account-replicator/0.log"
Mar 19 11:11:13.173122 master-0 kubenswrapper[15202]: I0319 11:11:13.173033 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/account-auditor/0.log"
Mar 19 11:11:13.184636 master-0 kubenswrapper[15202]: I0319 11:11:13.184575 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/account-reaper/0.log"
Mar 19 11:11:13.198117 master-0 kubenswrapper[15202]: I0319 11:11:13.198048 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/container-server/0.log"
Mar 19 11:11:13.253285 master-0 kubenswrapper[15202]: I0319 11:11:13.253221 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/container-replicator/0.log"
Mar 19 11:11:13.261574 master-0 kubenswrapper[15202]: I0319 11:11:13.261439 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/container-auditor/0.log"
Mar 19 11:11:13.278385 master-0 kubenswrapper[15202]: I0319 11:11:13.278291 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/container-updater/0.log"
Mar 19 11:11:13.287989 master-0 kubenswrapper[15202]: I0319 11:11:13.287923 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/object-server/0.log"
Mar 19 11:11:13.327866 master-0 kubenswrapper[15202]: I0319 11:11:13.327708 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/object-replicator/0.log"
Mar 19 11:11:13.361936 master-0 kubenswrapper[15202]: I0319 11:11:13.361827 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/object-auditor/0.log"
Mar 19 11:11:13.371604 master-0 kubenswrapper[15202]: I0319 11:11:13.371508 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/object-updater/0.log"
Mar 19 11:11:13.383435 master-0 kubenswrapper[15202]: I0319 11:11:13.383339 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/object-expirer/0.log"
Mar 19 11:11:13.394891 master-0 kubenswrapper[15202]: I0319 11:11:13.394719 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/rsync/0.log"
Mar 19 11:11:13.412348 master-0 kubenswrapper[15202]: I0319 11:11:13.412304 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d9c99748-0ca1-4e25-947a-801c2e8748f5/swift-recon-cron/0.log"
Mar 19 11:11:13.463929 master-0 kubenswrapper[15202]: I0319 11:11:13.463696 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/3.log"
Mar 19 11:11:13.509563 master-0 kubenswrapper[15202]: I0319 11:11:13.506907 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b6568d85-grltt_269465d8-91d6-40d7-bfde-3eff9b93c1cf/console-operator/4.log"
Mar 19 11:11:14.707500 master-0 kubenswrapper[15202]: I0319 11:11:14.705559 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdb5b65cd-fdkqt_64b2a748-e1c7-458d-9287-ab369cd3f056/console/0.log"
Mar 19 11:11:14.803220 master-0 kubenswrapper[15202]: I0319 11:11:14.803152 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-66b8ffb895-7n68q_1dc7476c-75a8-40fe-93f7-fca31aa2ebda/download-server/0.log"
Mar 19 11:11:17.083184 master-0 kubenswrapper[15202]: I0319 11:11:17.083112 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-7d87854d6-g96tv_31742478-0d83-48cf-b38b-02416d95d4a8/cluster-storage-operator/1.log"
Mar 19 11:11:17.108492 master-0 kubenswrapper[15202]: I0319 11:11:17.108420 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/2.log"
Mar 19 11:11:17.109158 master-0 kubenswrapper[15202]: I0319 11:11:17.109104 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-dzfgb_e3376275-294d-446d-9b4c-930df60dba01/snapshot-controller/3.log"
Mar 19 11:11:17.134591 master-0 kubenswrapper[15202]: I0319 11:11:17.134529 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5f5d689c6b-dspnb_e09725c2-45c6-4a60-b817-6e5316d6f8e8/csi-snapshot-controller-operator/0.log"
Mar 19 11:11:18.162137 master-0 kubenswrapper[15202]: I0319 11:11:18.161997 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-fdxtp_ece5177b-ae15-4c33-a8d4-612ab50b2b8b/dns-operator/0.log"
Mar 19 11:11:18.219874 master-0 kubenswrapper[15202]: I0319 11:11:18.219739 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-9c5679d8f-fdxtp_ece5177b-ae15-4c33-a8d4-612ab50b2b8b/kube-rbac-proxy/0.log"
Mar 19 11:11:19.234564 master-0 kubenswrapper[15202]: I0319 11:11:19.234107 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p88qq_b8f39c16-3a94-45c3-a51c-f2e81eff967d/dns/0.log"
Mar 19 11:11:19.257870 master-0 kubenswrapper[15202]: I0319 11:11:19.257726 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p88qq_b8f39c16-3a94-45c3-a51c-f2e81eff967d/kube-rbac-proxy/0.log"
Mar 19 11:11:19.294148 master-0 kubenswrapper[15202]: I0319 11:11:19.292603 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pmxm8_d52fa1ad-0071-4506-bb94-e73d6f15a75c/dns-node-resolver/0.log"
Mar 19 11:11:20.688210 master-0 kubenswrapper[15202]: I0319 11:11:20.688146 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-ct498_9663cc40-a69d-42ba-890e-071cb85062f5/etcd-operator/1.log"
Mar 19 11:11:20.700756 master-0 kubenswrapper[15202]: I0319 11:11:20.700689 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-ct498_9663cc40-a69d-42ba-890e-071cb85062f5/etcd-operator/0.log"
Mar 19 11:11:21.649837 master-0 kubenswrapper[15202]: I0319 11:11:21.649684 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcdctl/0.log"
Mar 19 11:11:21.929702 master-0 kubenswrapper[15202]: I0319 11:11:21.926917 15202 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" event={"ID":"16475c20-12f3-450c-a1b3-b763e7940e47","Type":"ContainerStarted","Data":"b44aee98ab27efe5da332c44e1aec1d68a5601ce41b5e7bef0ed6347dce830d3"}
Mar 19 11:11:21.970116 master-0 kubenswrapper[15202]: I0319 11:11:21.967692 15202 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p4zsp/master-0-debug-2l4jg" podStartSLOduration=2.00902487 podStartE2EDuration="21.967449343s" podCreationTimestamp="2026-03-19 11:11:00 +0000 UTC" firstStartedPulling="2026-03-19 11:11:00.939567268 +0000 UTC m=+6378.324982084" lastFinishedPulling="2026-03-19 11:11:20.897991741 +0000 UTC m=+6398.283406557" observedRunningTime="2026-03-19 11:11:21.958588025 +0000 UTC m=+6399.344002841" watchObservedRunningTime="2026-03-19 11:11:21.967449343 +0000 UTC m=+6399.352864159"
Mar 19 11:11:22.515286 master-0 kubenswrapper[15202]: I0319 11:11:22.515209 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd/0.log"
Mar 19 11:11:22.539559 master-0 kubenswrapper[15202]: I0319 11:11:22.539086 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-metrics/0.log"
Mar 19 11:11:22.560395 master-0 kubenswrapper[15202]: I0319 11:11:22.559943 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-readyz/0.log"
Mar 19 11:11:22.578606 master-0 kubenswrapper[15202]: I0319 11:11:22.578543 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-rev/0.log"
Mar 19 11:11:22.590919 master-0 kubenswrapper[15202]: I0319 11:11:22.590864 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/setup/0.log"
Mar 19 11:11:22.607751 master-0 kubenswrapper[15202]: I0319 11:11:22.607700 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-ensure-env-vars/0.log"
Mar 19 11:11:22.652278 master-0 kubenswrapper[15202]: I0319 11:11:22.652218 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_094204df314fe45bd5af12ca1b4622bb/etcd-resources-copy/0.log"
Mar 19 11:11:22.710040 master-0 kubenswrapper[15202]: I0319 11:11:22.706125 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_259aa9cc-51a9-498e-b099-ba4d781801c5/installer/0.log"
Mar 19 11:11:22.752144 master-0 kubenswrapper[15202]: I0319 11:11:22.752075 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_7e9b2506-dac6-4a23-b2bf-e3ce77919857/installer/0.log"
Mar 19 11:11:23.796303 master-0 kubenswrapper[15202]: I0319 11:11:23.796103 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-5549dc66cb-dcmsc_a417fe25-4aca-471c-941d-c195b6141042/cluster-image-registry-operator/0.log"
Mar 19 11:11:23.816186 master-0 kubenswrapper[15202]: I0319 11:11:23.816129 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qd25m_2c72041e-60f6-43b1-b435-16874b591bd4/node-ca/0.log"
Mar 19 11:11:24.888307 master-0 kubenswrapper[15202]: I0319 11:11:24.887738 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/1.log"
Mar 19 11:11:24.891196 master-0 kubenswrapper[15202]: I0319 11:11:24.891151 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/ingress-operator/2.log"
Mar 19 11:11:24.906324 master-0 kubenswrapper[15202]: I0319 11:11:24.906272 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-pgdrx_6a8e2194-aba6-4929-a29c-47c63c8ff799/kube-rbac-proxy/0.log"
Mar 19 11:11:25.739671 master-0 kubenswrapper[15202]: I0319 11:11:25.739611 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gmjrw_5ce57500-da52-4d24-8fa6-868dae9a6932/serve-healthcheck-canary/0.log"
Mar 19 11:11:26.430496 master-0 kubenswrapper[15202]: I0319 11:11:26.430166 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-68bf6ff9d6-wshz8_0cb70a30-a8d1-4037-81e6-eb4f0510a234/insights-operator/3.log"
Mar 19 11:11:26.471620 master-0 kubenswrapper[15202]: I0319 11:11:26.471560 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-68bf6ff9d6-wshz8_0cb70a30-a8d1-4037-81e6-eb4f0510a234/insights-operator/4.log"
Mar 19 11:11:27.426171 master-0 kubenswrapper[15202]: I0319 11:11:27.426101 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jkh97_5cc321ea-4a7a-440d-a58c-a9d141f87363/controller/0.log"
Mar 19 11:11:27.434691 master-0 kubenswrapper[15202]: I0319 11:11:27.434632 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jkh97_5cc321ea-4a7a-440d-a58c-a9d141f87363/kube-rbac-proxy/0.log"
Mar 19 11:11:27.457279 master-0 kubenswrapper[15202]: I0319 11:11:27.457208 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/controller/0.log"
Mar 19 11:11:28.691987 master-0 kubenswrapper[15202]: I0319 11:11:28.691932 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/frr/0.log"
Mar 19 11:11:28.701994 master-0 kubenswrapper[15202]: I0319 11:11:28.701950 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/reloader/0.log"
Mar 19 11:11:28.706814 master-0 kubenswrapper[15202]: I0319 11:11:28.706788 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/frr-metrics/0.log"
Mar 19 11:11:28.718798 master-0 kubenswrapper[15202]: I0319 11:11:28.718288 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/kube-rbac-proxy/0.log"
Mar 19 11:11:28.727325 master-0 kubenswrapper[15202]: I0319 11:11:28.726979 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/kube-rbac-proxy-frr/0.log"
Mar 19 11:11:28.741781 master-0 kubenswrapper[15202]: I0319 11:11:28.741719 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/cp-frr-files/0.log"
Mar 19 11:11:28.746762 master-0 kubenswrapper[15202]: I0319 11:11:28.746720 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/cp-reloader/0.log"
Mar 19 11:11:28.756654 master-0 kubenswrapper[15202]: I0319 11:11:28.756592 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dttqv_b9d34e98-54a4-4e3b-ae50-92832b3dce0b/cp-metrics/0.log"
Mar 19 11:11:28.769661 master-0 kubenswrapper[15202]: I0319 11:11:28.769582 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-sfpc9_fb00ada9-e047-47a7-82b0-44a3a66d6669/frr-k8s-webhook-server/0.log"
Mar 19 11:11:28.771784 master-0 kubenswrapper[15202]: I0319 11:11:28.771758 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/alertmanager/0.log"
Mar 19 11:11:28.795452 master-0 kubenswrapper[15202]: I0319 11:11:28.795413 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/config-reloader/0.log"
Mar 19 11:11:28.796759 master-0 kubenswrapper[15202]: I0319 11:11:28.796723 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d7b76b756-hw274_52d0ba98-8db8-45ec-b212-8bec41dac138/manager/0.log"
Mar 19 11:11:28.807552 master-0 kubenswrapper[15202]: I0319 11:11:28.807458 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-754b74fdf5-vvbj2_87dc574f-d263-420b-9029-edc87ea6c142/webhook-server/0.log"
Mar 19 11:11:28.817711 master-0 kubenswrapper[15202]: I0319 11:11:28.817652 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/kube-rbac-proxy-web/0.log"
Mar 19 11:11:28.838545 master-0 kubenswrapper[15202]: I0319 11:11:28.837629 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/kube-rbac-proxy/0.log"
Mar 19 11:11:28.866501 master-0 kubenswrapper[15202]: I0319 11:11:28.865285 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/kube-rbac-proxy-metric/0.log"
Mar 19 11:11:28.888725 master-0 kubenswrapper[15202]: I0319 11:11:28.888627 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/prom-label-proxy/0.log"
Mar 19 11:11:28.918542 master-0 kubenswrapper[15202]: I0319 11:11:28.918481 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80f89d04-6a07-4e86-b211-273789da32f2/init-config-reloader/0.log"
Mar 19 11:11:29.084609 master-0 kubenswrapper[15202]: I0319 11:11:29.084471 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-58845fbb57-z2869_7ad3ef11-90df-40b1-acbf-ed9b0c708ddb/cluster-monitoring-operator/0.log"
Mar 19 11:11:29.108503 master-0 kubenswrapper[15202]: I0319 11:11:29.108081 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-vjbnk_90ebca14-2ef4-4875-a682-48d7cc6fdd63/kube-state-metrics/0.log"
Mar 19 11:11:29.112494 master-0 kubenswrapper[15202]: I0319 11:11:29.111810 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv_763180d5-9e68-4e72-ad58-157a402e51eb/extract/0.log"
Mar 19 11:11:29.127515 master-0 kubenswrapper[15202]: I0319 11:11:29.127434 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv_763180d5-9e68-4e72-ad58-157a402e51eb/util/0.log"
Mar 19 11:11:29.133550 master-0 kubenswrapper[15202]: I0319 11:11:29.133449 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-vjbnk_90ebca14-2ef4-4875-a682-48d7cc6fdd63/kube-rbac-proxy-main/0.log"
Mar 19 11:11:29.143504 master-0 kubenswrapper[15202]: I0319 11:11:29.141432 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514cxhrrv_763180d5-9e68-4e72-ad58-157a402e51eb/pull/0.log"
Mar 19 11:11:29.149089 master-0 kubenswrapper[15202]: I0319 11:11:29.149030 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7bbc969446-vjbnk_90ebca14-2ef4-4875-a682-48d7cc6fdd63/kube-rbac-proxy-self/0.log"
Mar 19 11:11:29.164383 master-0 kubenswrapper[15202]: I0319 11:11:29.164306 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-j929h_7a63b024-b47d-4e28-b8df-db50a3c95bed/manager/0.log"
Mar 19 11:11:29.185773 master-0 kubenswrapper[15202]: I0319 11:11:29.185729 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8c858dd9d-j8mx9_752fcbfa-1386-4b68-ac42-5ace89d63908/metrics-server/0.log"
Mar 19 11:11:29.198437 master-0 kubenswrapper[15202]: I0319 11:11:29.197403 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jkzd2_f2a48cff-3780-4bd2-b12c-e6b77a990d8b/speaker/0.log"
Mar 19 11:11:29.207935 master-0 kubenswrapper[15202]: I0319 11:11:29.207892 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jkzd2_f2a48cff-3780-4bd2-b12c-e6b77a990d8b/kube-rbac-proxy/0.log"
Mar 19 11:11:29.219035 master-0 kubenswrapper[15202]: I0319 11:11:29.218987 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5d7d9df6f8-qwngc_15566e56-f6ea-4628-87cd-c6151735cea3/monitoring-plugin/0.log"
Mar 19 11:11:29.251086 master-0 kubenswrapper[15202]: I0319 11:11:29.251037 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fxzb9_9a8f8ced-6f9c-44ec-885d-da84f0ae27ae/node-exporter/0.log"
Mar 19 11:11:29.286908 master-0 kubenswrapper[15202]: I0319 11:11:29.286854 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fxzb9_9a8f8ced-6f9c-44ec-885d-da84f0ae27ae/kube-rbac-proxy/0.log"
Mar 19 11:11:29.311064 master-0 kubenswrapper[15202]: I0319 11:11:29.311020 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fxzb9_9a8f8ced-6f9c-44ec-885d-da84f0ae27ae/init-textfile/0.log"
Mar 19 11:11:29.351079 master-0 kubenswrapper[15202]: I0319 11:11:29.350969 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-gh4px_feb06c2f-79d5-4c1d-a8da-8db82de9b2f9/kube-rbac-proxy-main/0.log"
Mar 19 11:11:29.379817 master-0 kubenswrapper[15202]: I0319 11:11:29.379515 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-gh4px_feb06c2f-79d5-4c1d-a8da-8db82de9b2f9/kube-rbac-proxy-self/0.log"
Mar 19 11:11:29.411832 master-0 kubenswrapper[15202]: I0319 11:11:29.411779 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5dc6c74576-gh4px_feb06c2f-79d5-4c1d-a8da-8db82de9b2f9/openshift-state-metrics/0.log"
Mar 19 11:11:29.495027 master-0 kubenswrapper[15202]: I0319 11:11:29.494964 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07fbef0-2fa8-4240-8b80-0c96f3ca53c7/prometheus/0.log"
Mar 19 11:11:29.527090 master-0 kubenswrapper[15202]: I0319 11:11:29.527028 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07fbef0-2fa8-4240-8b80-0c96f3ca53c7/config-reloader/0.log"
Mar 19 11:11:29.561748 master-0 kubenswrapper[15202]: I0319 11:11:29.561652 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07fbef0-2fa8-4240-8b80-0c96f3ca53c7/thanos-sidecar/0.log"
Mar 19 11:11:29.588182 master-0 kubenswrapper[15202]: I0319 11:11:29.588108 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07fbef0-2fa8-4240-8b80-0c96f3ca53c7/kube-rbac-proxy-web/0.log"
Mar 19 11:11:29.636212 master-0 kubenswrapper[15202]: I0319 11:11:29.636115 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07fbef0-2fa8-4240-8b80-0c96f3ca53c7/kube-rbac-proxy/0.log"
Mar 19 11:11:29.669030 master-0 kubenswrapper[15202]: I0319 11:11:29.668981 15202 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c07fbef0-2fa8-4240-8b80-0c96f3ca53c7/kube-rbac-proxy-thanos/0.log"